WO2018094534A1 - System, method and/or computer readable medium for non-invasive workflow augmentation - Google Patents

System, method and/or computer readable medium for non-invasive workflow augmentation

Info

Publication number
WO2018094534A1
WO2018094534A1 (PCT/CA2017/051418, CA2017051418W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
data
image
information
image capture
Prior art date
Application number
PCT/CA2017/051418
Other languages
French (fr)
Inventor
Arash Abadpour
Bahareh Gholamzadeh AJEZ
Graham GREENLAND
Original Assignee
Fio Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fio Corporation filed Critical Fio Corporation
Priority to CA3083851A priority Critical patent/CA3083851A1/en
Publication of WO2018094534A1 publication Critical patent/WO2018094534A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00 Aspects of interface with display user
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00 Aspects of data communication
    • G09G 2370/02 Networking aspects
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00 Specific applications
    • G09G 2380/08 Biomedical applications
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G 3/002 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background, to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/10 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to drugs or medications, e.g. for ensuring correct administration to patients
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics, for computer-aided diagnosis, e.g. based on medical expert systems
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • the present invention relates generally to apparatuses and techniques for projecting data for computing devices and, more particularly, for projecting augmented data associated with an object.
  • a user interface generally communicates information or data for business, personal, or other purposes thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, user interfaces may also vary regarding what information is handled, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated.
  • user interfaces may be general or may be configured for a specific user or a specific use such as health service provisioning, financial transaction processing, enterprise data storage, or global communications.
  • user interfaces may include a variety of hardware and software components that may be configured to process, store, and communicate information and may include one or more computer systems, data storage systems, and networking systems.
  • a keyboard projected on a desktop surface accepts user inputs to projected keys instead of physical keys.
  • Structured infrared light projected over the desktop aids the camera in detecting inputs at keys by reflected infrared light that results when an input is made at a key.
  • Each of these input and output devices works independently from the others, based upon an end user's election to interact with an information handling system using a particular input or output device.
  • For example, in the health care industry, it may be desirable to provide a system which is capable of interpreting test results by placing, for example, rapid diagnostic tests ("RDTs") onto a surface, having such tests read in situ, and having projected onto those tests a basic and intuitive interpretation of the results, for instance a green border for good results and another color for tests in which it was determined the testing was conducted improperly, as well as appropriate next steps, including possible treatments, based on medical records retrieved automatically by the system from a database. Further, it may be desirable in such an industry to minimize physical contact between the end user and the surface or tests to reduce contact with possibly infected samples or surfaces.
  • the present disclosure provides a system, method and/or computer-readable medium for presenting augmented data. More specifically, embodiments of the present invention are directed to a system, method, and/or computer-readable medium for generating and displaying augmented data to enhance one or more workflows for a user.
  • the system, method and/or non-transitory computer-readable medium includes an image capture device for capturing an image of the object within an image capture area, the image including object data. Also included is a database comprising supplementary data associated with the object.
  • a work area processor receives the image, accesses the database, and automatically uses the object data and the supplementary data to generate augmented data associated with the object.
  • a presentation device is provided for presenting the augmented data associated with the object by displaying the augmented data on the surface within a presentation area.
  • the system, method and/or non-transitory computer readable medium is operative to facilitate non-invasive augmentation of the one or more workflows for the user.
  • the object comprises a face, a label, an RDT, a body part, packaging, and/or a drug.
  • At least a portion of the presentation area overlaps with the image capture area.
  • a cover supported by the surface can be used to facilitate registration of the presentation area and the image capture area in relation to the surface.
  • the cover provides fiducials to enhance registration.
  • the presentation device and the image capture device are elevated in relation to the surface.
  • the augmented data is adapted for user interaction.
  • the augmented data comprises patient identification, color coding, a user interface, medical information, patient history, training information, and/or drug information.
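  • As a purely illustrative, hedged sketch (not taken from the application; all function and field names are hypothetical), the capture-lookup-augment-present flow summarized above might be organized as follows in Python, with the captured image and database represented by plain dictionaries:

        # Illustrative sketch only: a minimal, self-contained mock of the capture ->
        # lookup -> augment -> present pipeline described above.

        def extract_object_data(image):
            """Stand-in for image analysis that recognizes an object in the image."""
            return {"object_id": image.get("object_id"), "position": image.get("position")}

        def generate_augmented_data(object_data, supplementary_data):
            """Combine object data with supplementary data into augmented data."""
            return {
                "object_id": object_data["object_id"],
                "overlay_text": supplementary_data.get("description", ""),
                "border_colour": "green" if supplementary_data.get("valid_result") else "red",
                "position": object_data["position"],  # where to project on the surface
            }

        if __name__ == "__main__":
            # A captured image is represented here as a plain dict for illustration.
            image = {"object_id": "RDT-001", "position": (120, 80)}
            database = {"RDT-001": {"description": "Malaria RDT, lot 42", "valid_result": True}}

            object_data = extract_object_data(image)
            supplementary_data = database[object_data["object_id"]]
            augmented_data = generate_augmented_data(object_data, supplementary_data)
            print(augmented_data)   # would be handed to the presentation device for projection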
  • FIG. 1 is an illustration of an embodiment of the present invention.
  • FIG. 2 is an illustration of yet another embodiment of the present invention.
  • FIG. 3 is an illustration of yet another embodiment of the present invention.
  • FIG. 4 is an illustration of yet another embodiment of the present invention.
  • FIG. 5 is a flow chart of an embodiment of the present invention.
  • FIG. 6 is an illustration of yet another embodiment of the present invention.
  • FIG. 7 is an illustration of yet another embodiment of the present invention.
  • FIG. 8 is an illustration of yet another embodiment of the present invention.
  • FIG. 9 is an illustration of yet another embodiment of the present invention.
  • FIG. 10 is an illustration of yet another embodiment of the present invention.
  • FIG. 11 is an illustration of yet another embodiment of the present invention.
  • FIG. 12 is an illustration of yet another embodiment of the present invention.
  • FIG. 13 is an illustration of yet another embodiment of the present invention.
  • FIG. 14 is an illustration of yet another embodiment of the present invention.
  • FIG. 15 is an illustration of yet another embodiment of the present invention.
  • FIG. 16 is an illustration of yet another embodiment of the present invention.
  • FIG. 17 is an illustration of yet another embodiment of the present invention.
  • FIG. 18 is an illustration of yet another embodiment of the present invention.
  • the present invention can be implemented in numerous ways, including as a process, a method, an apparatus, a system, a device, or a computer readable medium such as a computer readable storage medium or a computer network wherein program instructions are sent over a network (e.g. optical or electronic communication links).
  • these implementations, or any other form that the invention may take, may be referred to as processes.
  • the order of the steps of the disclosed processes may be altered within the scope of the invention.
  • Preferred embodiments of the present invention can be implemented in numerous configurations depending on implementation choices based upon the principles described herein.
  • Various specific aspects are disclosed, which are illustrative embodiments not to be construed as limiting the scope of the disclosure.
  • One aspect of the disclosure is a method, computer program product, apparatus, and system for the exchange of data between two or more cloud systems.
  • although the present specification describes components and functions implemented in the embodiments with reference to standards and protocols known to a person skilled in the art, the present disclosure as well as the embodiments of the present invention are not limited to any specific standard or protocol.
  • the Internet is a global computer network which comprises a vast number of computers and computer networks which are interconnected through communication links.
  • an electronic communications network of the present invention may include, but is not limited to, one or more of the following: a local area network, a wide area network, peer to peer communication, an intranet, or the Internet.
  • the interconnected computers exchange information using various services, including, but not limited to, electronic mail, Gopher, web-services, application programming interface (API), File Transfer Protocol (FTP).
  • This network allows a server computer system (a Web server) to send graphical Web pages of information to a remote client computer system. The remote client computer system can then display the Web pages via its web browser.
  • Each Web page (or link) of the "world wide web" (“WWW”) is uniquely identifiable by a Uniform Resource Locator (URL).
  • a client computer system specifies the URL for that Web page in a request (e.g., a Hypertext Transfer Protocol ("HTTP") request).
  • the request is forwarded to the Web server that supports the Web page.
  • the Web server receives the request, it sends the Web page to the client computer system.
  • When the client computer system receives the Web page, it typically displays the Web page using a browser.
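  • By way of a generic illustration of the request/response exchange described here (not specific to any embodiment of the invention; the URL is a placeholder), a client can request a Web page by its URL using Python's standard library:

        # Generic illustration of an HTTP request for a Web page identified by a URL.
        from urllib.request import urlopen

        url = "http://example.com/index.html"                 # the URL identifying the Web page
        with urlopen(url) as response:                        # the request is forwarded to the Web server
            html_document = response.read().decode("utf-8")   # the server returns the HTML document

        print(html_document[:200])                            # a browser would render this HTML for display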
  • a web browser or a browser is a special-purpose application program that effects the requesting of web pages and the displaying of web pages and the use of web-based applications.
  • Commercially available browsers include Microsoft Internet Explorer, Firefox, and Google Chrome, among others. It may be understood that, with embodiments of the present invention, any browser would be suitable.
  • Web pages are typically defined using HTML.
  • HTML provides a standard set of tags that define how a Web page is to be displayed.
  • the browser sends a request to the server computer system to transfer to the client computer system an HTML document that defines the Web page.
  • the browser displays the Web page as defined by the HTML document.
  • the HTML document contains various tags that control the displaying of text, graphics, controls, and other features.
  • the HTML document may contain URLs of other Web pages available on that server computer system or other server computer systems.
  • a person skilled in the relevant art may generally understand that a web-based application refers to any program that is accessed over a network connection using HTTP, rather than existing within a device's memory.
  • Web-based applications often run inside a web browser or web portal.
  • Web-based applications also may be client-based, where a small part of the program is downloaded to a user's desktop, but processing is done over the Internet on an external server.
  • Web-based applications may also be dedicated programs installed on an internet-ready device, such as a smart phone or tablet.
  • a person skilled in the relevant art may understand that a web site may also act as a web portal.
  • a web portal may be a web site that provides a variety of services to users via a collection of web sites or web based applications.
  • a portal is most often one specially designed site or application that brings information together from diverse sources in a uniform way.
  • each information source gets its dedicated area on the page for displaying information (a portlet); often, the user can configure which ones to display.
  • Portals typically provide an opportunity for users to input information into a system.
  • Variants of portals include "dashboards". The extent to which content is displayed in a "uniform way" may depend on the intended user and the intended purpose, as well as the diversity of the content. Very often design emphasis is on a certain "metaphor" for configuring and customizing the presentation of the content and the chosen implementation framework and/or code libraries.
  • the role of the user in an organization may determine which content can be added to the portal or deleted from the portal configuration.
  • a portable electronic device refers to any portable electronic device that can be used to access a computer network such as, for example, the internet.
  • a portable electronic device comprises a display screen, at least one input/output device, a processor, memory, a power module and a tactile man-machine interface, as well as other components that are common to the portable electronic devices that individuals or members carry with them on a daily basis.
  • portable devices suitable for use with the present invention include, but are not limited to, smart phones, cell phones, wireless data/email devices, tablets, PDAs and MP3 players, test devices, etc.
  • network ready device or “internet ready device” refers to devices that are capable of connecting to and accessing a computer network, such as, for example, the Internet, including but not limited to an IoT device.
  • a network ready device may access the computer network through well-known methods, including, for example, a web-browser.
  • Examples of internet-ready devices include, but are not limited to, mobile devices (including smart-phones, tablets, PDAs, etc.), gaming consoles, and smart-TVs. It may be understood by a person skilled in the relevant art that embodiments of the present invention may be expanded to include applications for use on a network ready device (e.g. a cellphone).
  • the network ready device version of the applicable software may have a similar look and feel as a browser version but may be optimized to the device. It may be understood that other "smart" devices (devices that are capable of connecting to and accessing a computer network, such as, for example, the internet), such as medical or test devices, including but not limited to smart blood pressure monitors, smart glucometers, IoT devices, etc., may also be used with embodiments of the present invention.
  • downloading refers to receiving datum or data to a local system (e.g. a mobile device) from a remote system (e.g. a server) or to initiate such a datum or data transfer.
  • a remote system or clients from which a download might be performed include, but are not limited to, web servers, FTP servers, email servers, or other similar systems.
  • a download can mean either any file that may be offered for downloading or that has been downloaded, or the process of receiving such a file.
  • a person skilled in the relevant art may understand that the inverse operation, namely the sending of data from a local system (e.g. a mobile device) to a remote system (e.g. a database), may be referred to as "uploading".
  • the data and/or information used according to the present invention may be updated constantly, hourly, daily, weekly, monthly, yearly, etc. depending on the type of data and/or the level of importance inherent in, and/or assigned to, each type of data.
  • Some of the data may preferably be downloaded from the Internet, by satellite networks or other wired or wireless networks.
  • computers include a central processor, system memory, and a system bus that couples various system components including the system memory to the central processor.
  • a system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the structure of a system memory may be well known to those skilled in the art and may include a basic input/output system ("BIOS") stored in a read only memory (“ROM”) and one or more program modules such as operating systems, application programs and program data stored in random access memory (“RAM”).
  • Computers may also include a variety of interface units and drives for reading and writing data.
  • a user of the system can interact with the computer using a variety of input devices, all of which are known to a person skilled in the relevant art.
  • Computers can operate in a networked environment using logical connections to one or more remote computers or other devices, such as a server, a router, a network personal computer, a peer device or other common network node, a wireless telephone or wireless personal digital assistant.
  • the computer of the present invention may include a network interface that couples the system bus to a local area network ("LAN").
  • Networking environments are commonplace in offices, enterprise-wide computer networks and home computer systems.
  • a wide area network (“WAN”) such as the Internet, can also be accessed by the computer or mobile device.
  • the connections contemplated herein are exemplary and other ways of establishing a communications link between computers may be used in accordance with the present invention, including, for example, mobile devices and networks.
  • the existence of any of various well-known protocols, such as TCP/IP, Frame Relay, Ethernet, FTP, HTTP and the like, may be presumed, and the computer can be operated in a client-server configuration to permit a user to retrieve and send data to and from a web-based server.
  • any of various conventional web browsers can be used to display and manipulate data in association with a web based application.
  • the operation of the network ready device may be controlled by a variety of different program modules, engines, etc.
  • program modules are routines, algorithms, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • program modules may also be practiced with other computer system configurations, including multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, personal computers, minicomputers, mainframe computers, and the like.
  • the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote memory storage devices.
  • Embodiments of the present invention may include a computer system.
  • the computer system can be a personal computer, mobile device, notebook computer, server computer, mainframe, networked computer (e.g., router), workstation, and the like.
  • the computer system includes a processor coupled to a bus and memory storage coupled to the bus.
  • the memory storage can be volatile or non-volatile (i.e. transitory or non-transitory) and can include removable storage media.
  • the computer can also include a display, provision for data input and output, etc. as may be understood by a person skilled in the relevant art.
  • FIG. 18 illustrates a more detailed diagram of an example computing device 400 within which a set of instructions, for causing the computing device to perform any one or more of the methods discussed herein, may be executed.
  • the computing device 400 may include additional or different components, some of which may be optional and not necessary to provide aspects of the present disclosure.
  • the computing device may be connected to other computing devices in a LAN, an intranet, an extranet, or the Internet.
  • the computing device may operate in the capacity of a server or a client computing device in a client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment.
  • the computing device may be provided by a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, or any computing device capable of executing a set of instructions (sequential or otherwise) that specify operations to be performed by that computing device.
  • the term "computing device” shall also be taken to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • Exemplary computing device 400 includes a processor 402, a main memory 404 (e.g., read-only memory (ROM) or dynamic random access memory (DRAM)), and a data storage device 414, which communicate with each other via a bus 426.
  • Processor 402 may be represented by one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, processor 402 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. Processor 402 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. Processor 402 is configured to execute instructions 424 for performing the operations and functions discussed herein.
  • Computing device 400 may further include a network interface device 406, image capture devices (e.g. video cameras, IR cameras, depth-sensing cameras, or multi-camera based systems) 410, a video display unit (e.g. an LCD screen or projection device) 420, a character input device 418 (e.g., a keyboard), and a touch screen input device 416.
  • Data storage device 414 may include a computer-readable storage medium 412 on which is stored one or more sets of instructions 424 embodying any one or more of the methodologies or functions described herein. Instructions 424 may also reside, completely or at least partially, within main memory 404 and/or within processor 402 during execution thereof by computing device 400, main memory 404 and processor 402 also constituting computer-readable storage media. Instructions 424 may further be transmitted or received over network 408 via network interface device 406.
  • a preferred embodiment of the present invention provides an information handling system to create an end-user immersed environment by managing the presentation of information and the acceptance of inputs through a user interface engine, such as instructions executing in one or more of a CPU, chipset and individual input and output components like those described below.
  • a display interfaces with a graphics system to receive pixel information that is presented as visual images. Displays may be configured as an integrated device, such as in a tablet, laptop or convertible information handling system, or as a peripheral device coupled through a cable or wireless interface.
  • a projector is, for example, essentially a display device that projects pixel information onto a projection surface, such as a desktop surface.
  • An infrared emitter projects infrared light to aid in touch detection and/or resolution of three dimensional images captured by a camera.
  • an infrared emitter projects an IR curtain just above a surface to detect touches at the surface that pass through the IR curtain.
  • an infrared emitter may project structured light that does not disrupt the end user but provides the camera with depth information that aids analysis of images to detect body parts, such as fingers and hands, and their dimensions.
  • User interface engine manages inputs and outputs to other types of devices, including a mouse, keyboard, camera and touchscreen.
  • User interface engine applies images captured by camera with and without enhancements by infrared emitter to manage passive input devices, such as totems that an end user manipulates to indicate an input.
  • User interface engine tracks input and output devices relative to applications, windows and content data with a user interface table that allows rapid transitions of presented content between available output devices.
  • application or "application software” to refer to a program or group of programs designed for end users. While there are system software, typically but not limited to, lower level programs (e.g. interact with computers at a basic level), application software resides above system software and may include, but is not limited to database programs, word processors, spreadsheets, etc. Application software may be grouped along with system software or published alone. Application software may simply be referred to as an "application”.
  • references to the "transmission”, “processing”, “interpretation” or the like of data associated with a cloud may refer to advancing through logic contained in the guideline. This may be accomplished, among other methods, by running on a processor one or more computer programs representative of the algorithms, processes, etc.
  • one or more non-invasive workflow augmentation systems, methods, computer-readable media, and/or cooperating environments may be disclosed.
  • the invention is contemplated for use in association with one or more cooperating environments, to afford increased functionality and/or advantageous utilities in association with same.
  • the invention is not so limited.
  • One or more of the disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like may be implemented in and/or by the invention, on their own, and/or without reference, regard or likewise implementation of one or more of the other disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like, in various permutations and combinations, as may be readily apparent to those skilled in the art, without departing from the pith, marrow, and spirit of the disclosed invention.
  • instructions 424 may include instructions for noninvasive workflow augmentation systems.
  • computer-readable storage medium 412 is shown in the example of FIG. 18 to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • the methods, components, and features described herein may be implemented by discrete hardware components or may be integrated in the functionality of other hardware components such as ASICS, FPGAs, DSPs or similar devices.
  • the methods, components, and features may be implemented by firmware modules or functional circuitry within hardware devices.
  • the methods, components, and features may be implemented in any combination of hardware devices and software components, or only in software.
  • the present disclosure also relates to an apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
  • the present invention is adapted to provide non-invasive workflow augmentation.
  • the present invention is preferably adapted to provide non-invasive workflow augmentation in the delivery of healthcare services.
  • an interface is preferably presented on a surface for interaction with a user and/or an object.
  • a presentation device 104 (e.g., a projector, laser, or beamer for projecting both visible and/or invisible light, which may include infrared light) presents information (e.g., a user interface, identification information, test results, etc.) for a user 20, and an image capture device 102 (e.g., a camera) captures images within an image capture area.
  • the presentation device 104 and image capture device 102 are provided as a combined device 106.
  • the presentation device 104 and image capture device 102 are positioned in a similar direction - i.e., at least a portion of the presentation area 40a,b overlaps with the image capture area 42a,b.
  • the presentation device 104 and the image capture device 102 are associated with optical axes (40a,b and 42a,b respectively) that are close to each other to provide a significant area of mutual coverage or overlap.
  • the user interface comprises a menu bar (not shown) with various interactive options for the user 20 including, but not limited to, patient data, device settings, help, etc.
  • the presentation device 104 and the image capture device 102 are preferably registered and/or calibrated to one another (e.g., to optimally align and/or position the respective optical axes 40a,b and 42a,b) prior to use by the user (e.g., during the device 106 manufacturing process).
  • the device 106 prior to use by the user 20, registers and/or calibrates the optical axes 40a,b and 42a,b relative to a cover 108 supported by a surface 10 (e.g., a table, a floor, etc.).
  • the cover 108 provides one or more fiducials 44 (e.g., a distinct symbol, design, grid, etc.) to serve as a marker in the field of view of the image capture device 102 - the markers appearing in the image captured by the device 102 - for use as a point of reference or a unit of measure.
  • the presentation device 104 preferably serves as a light source for the image capture device 102.
  • the presentation device 104 and the image capture device 102 are positioned above the cover 108 via a support 110 that may be fixed (e.g., clamped to the surface, affixed to a vertical support secured to the surface, overhanging the cover, etc.) or flexible (e.g., gooseneck, hinged vertical support, affixed to a vertical support unsecured to the surface, etc.).
  • the support 110 preferably positions the device 106 so that it is elevated relative to the cover 108.
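  • One conventional way to register an elevated camera-projector pair to a cover bearing fiducials 44 (offered here only as a hedged, illustrative sketch under assumed coordinates and libraries, not as the method disclosed in this application) is to detect the fiducials in the captured image and fit a planar homography between their image positions and their known positions on the cover, for example with OpenCV:

        # Hedged sketch of fiducial-based registration using a planar homography.
        # The fiducial coordinates below are invented for illustration; the application
        # does not specify this particular technique or library.
        import numpy as np
        import cv2

        # Known, predetermined fiducial positions on the cover (e.g. in millimetres).
        cover_points = np.array([[0, 0], [200, 0], [200, 150], [0, 150]], dtype=np.float32)

        # Fiducial positions as detected in the captured image (pixel coordinates).
        image_points = np.array([[34, 41], [598, 52], [587, 464], [29, 455]], dtype=np.float32)

        # Homography mapping image pixels to cover coordinates.
        H, _ = cv2.findHomography(image_points, cover_points)

        # Map an arbitrary detected point (e.g. an RDT corner) into cover coordinates.
        pt = np.array([[[310.0, 240.0]]], dtype=np.float32)
        pt_on_cover = cv2.perspectiveTransform(pt, H)
        print(pt_on_cover)   # position of the detected point on the cover, in cover units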
  • the device processor 112 is associated with the device 106 and in preferable embodiments is combined with the device 106.
  • a workflow area 52 is provided by the user 20 and is the area 52 wherein the workflows are conducted.
  • the device 106 is non-intrusive with respect to the workflows.
  • the device 106 preferably merges with the desired workflows of the user 20 with minimal disruption; minimal training and modification of workflows are required.
  • the system 100 in accordance with the present invention includes the work area subsystem 50, an administrator subsystem 70, and databases 80a,b,c (collectively "databases 80").
  • the work area subsystem 50 for use by the user 20 and/or a patient 30, includes the image capture device 102 and the presentation device 104 (collectively, the device 106) for use with the objects 22.
  • the device 106 is preferably elevated by the support 110 relative to the cover 108 supported by the surface 10.
  • the presentation device 104 presents content 114 (alternately "augmented data 114") on the cover 108.
  • the device processor 112, a secondary device 116 (e.g., tablet, laptop, smartphone, etc.) and work area database 118 are also provided in the work area 50.
  • the administrator subsystem 70 includes an administrator processor 72, an administrator database 74 and an administrator report 76.
  • the system 100 is shown in use with one or more communication networks 100.
  • the communication networks 100 may include satellite networks (e.g., GPS networks), terrestrial wireless networks, and the Internet.
  • Persons having ordinary skill in the art will appreciate that the system 100 includes hardware and software.
  • FIG. 4 schematically illustrates, among other things, that the work area subsystem 50 includes the work area processor 112, input-output devices 120 including the presentation device 104, the image capture device 102 and a speaker 123 for providing the user with audible information. Also included is the work area database 118, the secondary device 116, and a work area receiver-transmitter device 122.
  • Also included is a computer readable medium (e.g., a work area processor-readable memory 130) which stores one or more algorithms 801, 802, 803, 804 and/or 808 operatively encoded for use by the work area processor 112.
  • the algorithms 801, 802, 803, 804 and/or 808 provide the processor 112 with registration logic 801, recognition logic 802, image capture logic 803, presentation logic 804, and/or object identification logic 808.
  • the administrator subsystem 70 includes the administrator processor 72, the administrator database 74, an input-output device 75, and an administrator receiver-transmitter 76.
  • a computer readable medium (e.g., an administrator processor-readable memory) stores one or more algorithms 805, 806 and/or 807 operatively encoded for use by the administrator processor 72.
  • the algorithms 805, 806 and/or 807 provide the processor 72 with analysis logic 805, report generation logic 806 and/or work area control logic 807.
  • the device 106 preferably detects and registers (and/or calibrates with respect to) the cover 108 prior to performing an action.
  • the device 106 presents (e.g., by projection) one or more patterns 114 on the cover 108 to carry out the registration and/or calibration process.
  • the device 106 can preferably tolerate movements during use which may be caused, for example, by a user or patient hitting the surface.
  • the device 106 utilizes fiducials 44 displayed at predetermined coordinates on the cover 108 for registration.
  • the device 106 preferably augments objects 22 positioned on (and/or associated with) the cover 108 with augmented data 114 that is projected from the presentation device 104.
  • the device 106 augments one or more objects 22 placed on the cover 108 with various augmented data 114 including, but not limited to, descriptive text and graphics, notices, warnings and instructions.
  • the device 106 preferably provides the user 20 with a natural interface and may be especially advantageous for users 20 with minimal educational background, healthcare training and/or low digital literacy levels.
  • augmented data 114 is preferably generated using the object data (not shown) and the supplementary data (not shown) associated with the object from the one or more databases.
  • the device 106 interacts with the user 20 by presenting augmented data 114 on the cover 108 that is interactive (e.g., buttons and/or menus).
  • the device 106 engages the user 20 using a natural interface (e.g., hand gestures).
  • the user 20 may be prompted to interact with projected augmented data 114, such as a projected button and/or select an item from a projected menu.
  • the user 20 is prompted to provide the device 106 with inputted information by, for example, writing on the cover 108 (e.g., via a finger, an instrument such as a pen, etc.).
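  • As a simplified, hedged sketch of how a user's touch on projected, interactive augmented data might be registered (the application does not prescribe this logic; the button names and coordinates below are invented), a detected fingertip position can be hit-tested against the regions of projected buttons:

        # Hedged sketch: hit-testing a detected fingertip against projected button regions.
        # Button names and coordinates are invented for illustration only.

        PROJECTED_BUTTONS = {
            "patient_data": (40, 40, 160, 80),      # (x, y, width, height) on the cover
            "device_settings": (220, 40, 160, 80),
            "help": (400, 40, 160, 80),
        }

        def hit_test(fingertip, buttons=PROJECTED_BUTTONS):
            """Return the name of the projected button touched by the fingertip, if any."""
            fx, fy = fingertip
            for name, (x, y, w, h) in buttons.items():
                if x <= fx <= x + w and y <= fy <= y + h:
                    return name
            return None

        print(hit_test((250, 70)))   # -> "device_settings"
        print(hit_test((10, 300)))   # -> None (no projected button at that position)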
  • the user 20 is not required to excessively physically handle a particular object and may dispose of the cover 108 after conducting an activity in order to promote hygiene and reduce the likelihood of contaminating the site, patient, user, and/or objects (e.g., RDTs).
  • preferred embodiments of the present invention reduce the frequency of conducting decontamination procedures for physical objects (e.g., a reader, a keyboard, a notebook, etc.).
  • preferred embodiments promote non-invasive workflow augmentation for users.
  • the user 20 preferably performs traditional activities that the device 106 preferably augments (e.g., by observing, recording, and/or providing guidance to the user).
  • FIG. 5 depicts a workflow or method 200 which includes the following steps: a registration step 202, a user identification step 204, a patient identification step 206, an object identification step 208, a content presentation step 210, an activity step 212, a cover disposal step 214 and/or a step 216 of accessing (including uploading and downloading) cover registration data, user identification data, patient identification data, object identification data, content presentation data, and/or activity data from the databases 74, 80, 118.
  • the registration step 202 may include the registration of the cover 108 using the fiducials 44 and the establishment of a wireless connection with the secondary device 116.
  • the user identification step 204 includes the use of the device 106 to identify the user 20 via, for example, the facial recognition algorithm 802 with access to the databases 74, 80, 118 using the database access step 216.
  • the patient identification step 206 includes the use of the device 106 to identify the patient 30 via, for example, the facial recognition algorithm 802 with access to the databases 74, 80, 118 using the database access step 216.
  • the device 106 may capture an image comprising an object 22 positioned on the cover 108 during the object identification step 208 via, for example, the object identification algorithm 808 with access to the databases 74, 80, 118 using the database access step 216, the object 22 identifying the patient (e.g., a health card, a driver's license, etc.).
  • Object identification may be similar to the inventions disclosed in U.S. Provisional Patent Application Serial Nos. 62/426,494 filed on November 26, 2016 and 62/426,515 filed on November 26, 2016, the contents of both are incorporated herein by reference.
  • the object 22 comprises object data (not shown) including features and information associated with the object 22.
  • the content presentation step 210 includes the presentation of relevant augmented data 114, for example a patient profile (e.g., name, patient identifier, weight, height, last visit, patient history, etc.), on the cover 108 by the presentation device 104 for the user 20 via the presentation algorithm 804 with access to the databases 74, 80, 118 using the database access step 216.
  • the user identification step 204 may be required to restrict access to authorized users 20 and/or to associate the user 20 with an encounter with the patient 30.
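  • Purely as an organizational, hedged sketch (the step names follow FIG. 5 as described above, while the handler bodies and data structure are hypothetical), the method 200 could be driven as an ordered sequence of step handlers, each of which may read from or write to the databases 74, 80, 118:

        # Hedged sketch of the workflow of FIG. 5 as an ordered list of step handlers.
        # Handler bodies are placeholders; only the step ordering is taken from the text.

        def run_workflow(context):
            steps = [
                ("registration", lambda ctx: ctx.update(cover_registered=True)),        # step 202
                ("user_identification", lambda ctx: ctx.update(user="health worker")),  # step 204
                ("patient_identification", lambda ctx: ctx.update(patient="patient")),  # step 206
                ("object_identification", lambda ctx: ctx.update(objects=["RDT"])),     # step 208
                ("content_presentation", lambda ctx: ctx.update(presented=True)),       # step 210
                ("activity", lambda ctx: ctx.update(activity_done=True)),               # step 212
                ("cover_disposal", lambda ctx: ctx.update(cover_disposed=True)),        # step 214
            ]
            for name, handler in steps:
                handler(context)     # each handler may also access the databases (step 216)
            return context

        print(run_workflow({}))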
  • FIG. 6 depicts an illustration of a patient identification workflow 200a with the user 20 (alternately "health care worker 20") in accordance with an embodiment of the present invention.
  • the workflow 200a includes the registration step 202, the user identification step 204, the patient identification step 206, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the augmented data 114 includes, for example, a patient profile (e.g., name, patient identifier, weight, height, last visit, patient history, etc.).
  • the user 20 may preferably interact with the augmented data 114 (e.g., the patient profile can be scrolled by the user 20 using a finger or an instrument to reveal additional information).
  • FIG. 7 depicts an illustration of a patient intake workflow 200b in accordance with an embodiment of the present invention.
  • the workflow 200b includes the registration step 202, the user identification step 204, the patient identification step 206, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the augmented data 114 includes, for example, an interface (e.g., a keyboard) that the user 20 may use to input patient information (e.g., name, patient identifier, weight, height, symptoms, etc.).
  • the device 106 may be adapted to communicate (e.g., wifi, bluetooth) with a secondary computing device 116 (e.g., tablet, phone, laptop, etc.) as a secondary display or to input patient information.
  • the secondary computing device 116 may be preferable for the input or display of confidential information associated with the patient 30 (e.g., medical test results).
  • FIG. 8 depicts an illustration of a test results registration workflow 200c on an object 22 (e.g., an RDT) associated with a patient (not shown) in accordance with an embodiment of the present invention.
  • the workflow 200c includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the RDT 22 is positioned on the cover 108 and during step 208, the image capture device 102 automatically reads and identifies the RDT 22 by accessing one or more databases 74, 80, 118.
  • the presentation device 104 displays augmented data 114 on the cover 108, during step 210, comprising RDT information (e.g., type of RDT, incubation time, catalog number, test result, etc.).
  • the RDT information is preferably automatically associated or registered with a patient (not shown).
  • multiple RDTs can be positioned on the cover 108 and visually categorized by projecting, for example, a specific color around each RDT. For example, specific colors may be used to categorize RDTs based on patients, test results, incubation time, etc.
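  • As a hedged sketch of the colour categorization described above (the colour scheme, status names, and RDT records here are invented for illustration and are not taken from the application), a projection colour can be chosen for each RDT from its category:

        # Hedged sketch: assigning a projection colour to each RDT based on its category.
        RESULT_COLOURS = {"positive": "red", "negative": "green",
                          "incubating": "yellow", "invalid": "grey"}

        def colour_for(rdt):
            """Choose the colour to project around an RDT according to its status."""
            return RESULT_COLOURS.get(rdt["status"], "white")

        rdts = [
            {"id": "RDT-1", "patient": "P-100", "status": "negative"},
            {"id": "RDT-2", "patient": "P-101", "status": "incubating"},
            {"id": "RDT-3", "patient": "P-100", "status": "positive"},
        ]

        for rdt in rdts:
            print(rdt["id"], "->", colour_for(rdt))   # colour projected around each RDT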
  • FIG. 9 depicts an illustration of a multiple object workflow 200d involving two or more objects 22 (e.g., RDTs) in accordance with an embodiment of the present invention.
  • the workflow 200d includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the RDTs 22 are positioned on the cover 108 and during step 208 the image capture device 102 automatically reads and identifies each RDT 22 by accessing one or more databases 74, 80, 118.
  • the presentation device 104 during step 210 displays augmented data 114 on the cover 108 comprising patient information (e.g., name, patient identifier, weight, height, etc.) associated with one of the RDTs 22 selected by the user 20 during step 212.
  • the presentation device 104 is adapted to color code the patient information to match a color projected around the corresponding RDT 22 during step 210.
  • the device 106 may be adapted to communicate (e.g., wifi, bluetooth) with a secondary computing device 116 (e.g., tablet, phone, laptop, etc.) as a secondary display or to input patient information during the registration step 202.
  • the secondary computing device 116 may be preferable for the input or display of confidential information (e.g., medical test results).
  • the user 20 can alter the position of an RDT 22 on the cover 108 without interfering with any interaction between the RDT 22 and the device 106.
  • the device 106 may be adapted to provide an auditory warning to, for example, sound a chime if the correct RDT is selected or a positive test result is obtained. Conversely, an alarm may sound if an incorrect RDT is selected or a negative test result is obtained.
  • FIG. 10 depicts an illustration of an RDT analysis workflow 200e for analyzing and disposing of an object 22 (e.g., an RDT) in accordance with the present invention.
  • the workflow 200e includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • a first RDT 22a for a first patient and a second RDT 22b for a second patient are positioned on the cover 108 and associated with the respective patient during step 208.
  • the first and second RDTs 22 may be associated with a single patient (not shown).
  • Information associated with each RDT 22 is presented on the cover 108 to the user 20 by the presentation device 104 during step 210.
  • a third RDT 22c may be discarded in a biohazard disposal 28.
  • the third RDT 22c may contain information associated with the same or different patient as the first and second RDTs 22a,b.
  • the augmented data 114 presented for each RDT 22a,b,c during step 210 may preferably include a test result and patient information (e.g., photograph, name, patient identifier, etc.).
  • the device 106 and/or device processor (not shown) automatically records, saves, aggregates, and/or tracks the information received from the RDT 22a,b,c in the databases 74, 80, 118.
  • FIG. 11 depicts an illustration of a treatment workflow 200f for prescribing a patient with a treatment in accordance with the present invention.
  • the workflow 200f includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the content presentation step 210 includes a presentation of information associated with the patient, including patient identification, medical history, prescription drug records, etc.
  • the user 20 is dispensing an object 22 (e.g., the drug quinine) to the patient 30 during step 212.
  • the image capture device 102 captures an image comprising the drug 22 during step 208, identifies the medication, and automatically records the dispensation of the drug in the patient's prescription drug history stored in the databases 74, 80, 118.
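  • A hedged sketch of how an identified medication might be recorded against a patient's prescription history follows; the record fields, identifiers, and in-memory store are hypothetical stand-ins for the databases 74, 80, 118:

        # Hedged sketch: appending an identified drug dispensation to a patient's
        # prescription history. Field names and the in-memory "database" are invented.
        from datetime import datetime, timezone

        prescription_history = {}   # stand-in for databases 74, 80, 118

        def record_dispensation(patient_id, drug_name, user_id, history=prescription_history):
            entry = {
                "drug": drug_name,
                "dispensed_by": user_id,
                "timestamp": datetime.now(timezone.utc).isoformat(),
            }
            history.setdefault(patient_id, []).append(entry)
            return entry

        print(record_dispensation("P-100", "quinine", "HW-7"))
        print(prescription_history)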
  • FIG. 12 depicts an illustration of a patient diagnosis workflow 200g in accordance with the present invention.
  • the workflow 200g includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the content presentation step 210 includes a presentation of information associated with the patient, such as patient identification information (e.g., name, patient identifier, weight, height, etc.).
  • the user 20 can preferably input additional information based on a patient interview, including: symptoms, notes (i.e., "second degree burn"), prescribed treatment, etc.
  • an image of the patient's symptoms (e.g., burn, rash, etc.) is recorded during step 208.
  • FIG. 13 depicts an illustration of a blood collection workflow 200h in accordance with the present invention.
  • the workflow 200h includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the user 20, during step 212 collects blood from the patient 30 over the cover 108.
  • the image capture device 102 can capture an image or a series of images (e.g., video) of the blood collection during step 208.
  • the presentation device 104 is adapted to present content (not shown), such as videos and/or live communication with a second user, to instruct the user 20 on the current procedures for collecting blood.
  • FIG. 14 depicts a plan view of a patient physical workflow 200i in accordance with the present invention.
  • the workflow 200i includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • a secondary device 116 (e.g., a blood pressure monitor) may be positioned on the patient 30 by the user 20.
  • the device 106 may wirelessly communicate with the blood pressure monitor 116 to receive the blood pressure data of the patient 30 during step 212.
  • the device 106 may use the image capture device 102 to record an image comprising the blood pressure information from the monitor 116 during step 208.
  • the presentation device 104 may be adapted to present the augmented data 114 on the cover 108, which may include patient information such as blood pressure information, weight, height, and blood pressure over time.
  • FIG. 15 depicts an illustration of a medication registration workflow 200j in accordance with the present invention.
  • the workflow 200j includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214.
  • the user 20 positions an object 22 (e.g., unregistered medication) on the cover 108 and the image capture device 102 captures an image comprising the unregistered medication 22.
  • the user 20 may also preferably input information about the drug into the databases 74, 80, 118 using the interface (not shown), including drug name, lot number, expiration date, and total number in stock.
  • the registered medication includes the information entered by the user and an image of the medication. The registered medication information may be used in future activities.
  • FIG. 16 depicts an example of augmented data 114 that may be projected on to the cover 108 during a workflow 200.
  • the augmented data 114 comprises patient identification 114a, patient physiological measures 114b (e.g., heart rate, blood pressure), reason for visit 114c, patient medical history 114d, RDT test results 114e, and a menu bar 114f for the user to adjust settings.
  • FIG. 17 depicts an example of batch mode or batch processing according to the present invention. Skilled readers will understand that batch processing is the execution of a series of jobs (e.g., analysis of the RDT test result) without manual intervention.
  • objects 22 a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u are positioned on the cover 108.
  • the device 106 automatically reads each RDT 22 a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u and presents relevant augmented data 114 on the cover 108.
  • the augmented data 114 may be color coded to aid in visual examination.
  • RDTs 22 a,e,r,t may be associated with a negative result.
  • RDTs 22 l,n may be associated with a positive result.
  • RDTs 22 c,o,s may be associated with alert conditions.
  • RDTs 22 b,d,f,g,h,i,j,k,m,p,q,u may, for example, not have been associated with a particular patient. Skilled persons will understand that batch processing of RDT test results can be advantageous in an outbreak situation; an illustrative sketch of such color-coded batch processing appears after this list.
  • a larger display/interaction area is preferably provided at a significantly lower cost than that of a comparably large display device.
  • the device 106 may present different colors in association with each RDT 22 to indicate a condition (e.g., incubation period and results).
  • Fiducials 44 are preferably used by the software to register the surface.
  • a user 20 may register an RDT using the tablet. The user 20 positions the RDT 22, with the blood sample applied, on the cover 108. The device 106 reads all RDTs 22 simultaneously.
  • a secondary device 116 preferably protects the confidential information on individual RDTs.
  • the device 116 preferably instructs the user to discard the cover 108, which also collects the RDTs, and to dispose of the packaging. The user will place a new cover 108 on the surface prior to conducting subsequent activities.
  • the image capture device 102 registers a user's interaction with projected augmented data 114.
  • Workflow 200 augmentation is preferably enhanced by recognizing objects 22 placed on the cover 108, prompting a response in the projected augmented data 114.
  • a training tool for new health workers (e.g., simulation of patients and clinical cases) may be provided.
  • the invention provides: training tools for new clinical protocols of new RDTs (or new medical devices); monitoring tools for patients (e.g., monitoring the patient visit, clinical protocol follow-up, collection of demographic data, clinical testing, RDT testing, etc.); monitoring tools for logistics and health worker management; monitoring of stock and tests available; and monitoring of health worker performance (related to the training tools).
  • the administrator processor 72 may conduct analyses and generate reports 76 based on the workflows 200a-j.
  • the processor 72 may be used to interact with the work area subsystem 50.
  • the processor 72 may be used to update the work area processor 112 or to facilitate live communication and/or interaction between an administrator (not shown) and the user 20.
  • a preferred embodiment of the present invention provides a system comprising data storage (e.g., databases 118 in FIG. 4) that may be used to store all necessary data required for the operation of the system.
  • a person skilled in the relevant art may understand that a "data store" refers to a repository for temporarily or persistently storing and managing collections of data which include not just repositories like databases (a series of bytes that may be managed by a database management system (DBMS)), but also simpler store types such as simple files, emails, etc.
  • a data store in accordance with the present invention may be one or more databases, co-located or distributed geographically.
  • the data being stored may be in any format that may be applicable to the data itself, but may also be in a format that encapsulates the data quality.
  • various data stores or databases 74, 80, 118 may interface with the system of the present invention, preferably including, without limitation, proprietary databases, epidemiologic databases, medical records databases, UN and major/international healthcare institution databases, healthcare and emergency infrastructure databases, education and economic databases, news databases, demographic databases, communication and military infrastructure databases, image databases, and weather, travel, and topographic databases.
  • a clinical and healthcare database may preferably contain, among other things, diagnostic and medical data (clinical information), such as, for example, one or more of the following, which may or may not be related to medical events: (a) test results from diagnostic devices equipped with remote data transfer systems and/or global positioning or localization features; (b) information from UN databases and major healthcare international institutions; and/or (c) scenarios and knowledge data.
  • a sociological database may preferably contain, among other things, sociological data (human information), such as, for example, one or more of the following: (a) population information from local and/or international demographic databases; (b) political and/or organization systems in the area and/or from international databases; (c) education and/or economic systems in the area and/or from international databases; and/or (d) information from news and/or newspapers, drawn from the Internet or elsewhere.
  • An infrastructure database may preferably contain, among other things, infrastructure data or information, such as, for example, one or more of the following: (a) information concerning healthcare infrastructure; (b) information concerning communication infrastructures; and/or (c) information concerning emergency and/or military infrastructure; all preferably drawn from local and/or international databases.
  • a geophysics database may preferably contain, among other things, geophysics data or information, such as, for example, one or more of the following: (a) weather and/or climatic information from local databases; and/or (b) topographic information from local and/or international databases.
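Referring back to the batch mode of FIG. 17 described above, the following is a minimal, non-limiting sketch of how color-coded augmented data 114 might be generated for a batch of RDTs 22. The result categories, colors, field names and function names shown here are illustrative assumptions only; the specification does not prescribe a particular implementation or color scheme.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical result categories mirroring the groups described for FIG. 17.
NEGATIVE, POSITIVE, ALERT = "negative", "positive", "alert"

# Illustrative color mapping (RGB); the specification does not prescribe colors.
RESULT_COLORS = {
    NEGATIVE: (0, 255, 0),     # e.g., green border for a negative result
    POSITIVE: (255, 0, 0),     # e.g., red border for a positive result
    ALERT: (255, 165, 0),      # e.g., orange border for an alert condition
    None: (128, 128, 128),     # grey border when no result category applies
}

@dataclass
class RdtReading:
    """One RDT 22 detected on the cover 108 during batch processing."""
    rdt_id: str                       # e.g. "22a"
    bbox: tuple                       # (x, y, w, h) location on the registered cover
    result: Optional[str] = None      # NEGATIVE / POSITIVE / ALERT, or None if unread
    patient_id: Optional[str] = None  # None when the RDT has no associated patient

def build_batch_overlay(readings: List[RdtReading]) -> List[dict]:
    """Translate a batch of RDT readings into color-coded augmented-data items
    that the presentation device 104 could project onto the cover 108."""
    overlay = []
    for r in readings:
        color = RESULT_COLORS.get(r.result, RESULT_COLORS[None])
        label = r.result or "unassigned"
        if r.patient_id is None:
            label += " / no patient"
        overlay.append({"rdt_id": r.rdt_id, "bbox": r.bbox,
                        "border_color": color, "label": label})
    return overlay
```

In such a sketch, each detected RDT is mapped to one overlay item that the presentation device 104 could project at that RDT's location on the cover 108 to aid visual examination.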

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention is directed to a system, method and/or computer readable medium for non-invasive workflow augmentation. An image of an object, containing information, is obtained and supplemented with related information from a database to augment the information from the object. The augmented information is displayed to a user in association with the object.

Description

SYSTEM, METHOD AND/OR COMPUTER READABLE MEDIUM FOR NONINVASIVE WORKFLOW AUGMENTATION
RELATED APPLICATIONS
[0001] The present application claims the benefit of the earlier filed United States Provisional Patent Application No. 62/426,492, filed on November 26, 2016.
FIELD OF THE INVENTION
[0002] The present invention relates generally to apparatuses and techniques for projecting data for computing devices and, more particularly, for projecting augmented data associated with an object.
BACKGROUND OF THE INVENTION
[0003] In the field of workflow augmentation, the ability for an inexperienced user to operate a system may be greatly enhanced when data is presented in natural ways and when little input is required from the user. As such, individuals and businesses seek additional ways to present relevant information in intuitive ways. Many options of user interfaces are available to users. A user interface generally communicates information or data for business, personal, or other purposes, thereby allowing users to take advantage of the value of the information. Because technology and information handling needs and requirements vary between different users or applications, user interfaces may also vary regarding what information is handled, how the information is handled, how much information is processed, stored, or communicated, and how quickly and efficiently the information may be processed, stored, or communicated. The variations in user interfaces allow for user interfaces to be general or configured for a specific user or specific use such as health service provisioning, financial transaction processing, enterprise data storage, or global communications. In addition, user interfaces may include a variety of hardware and software components that may be configured to process, store, and communicate information and may include one or more computer systems, data storage systems, and networking systems.
[0004] In conventional user interfaces, namely the traditional computer interfaces, the interactions are managed with a keyboard and mouse. An end user indicates with a mouse which application should receive inputs by activating an application with a mouse click, and then inputs information to the application with the keyboard. Over time, various input and output devices have developed to enhance end user interactions through a variety of user interfaces; however, newer user interface devices generally have modified existing user interface device functionality. One example is the use of a touchscreen display to accept mouse and keyboard inputs with touches to visual information presented at the display, such as a keyboard or an application graphical user interface. Another example is the use of projected user interfaces that are presented at a projection surface and that have interactions tracked with a camera. For instance, a keyboard projected on a desktop surface accepts user inputs to projected keys instead of physical keys. Structured infrared light projected over the desktop aids the camera in detecting inputs at keys by reflected infrared light that results when an input is made at a key. Each of these input and output devices works independently from the others based upon an end user election to interact with an information handling system using a particular input or output device.
[0005] Conventional user interfaces are then invasive by their nature, requiring directed and physical input from an end user, which requires some level of knowledge of basic computer operation and a commonality of understanding with regard to language and other embedded norms, for instance cardinal priority. [0006] Prior art solutions may have previously failed to consider providing a user interface system that allows for minimal structured and physical input and where the presentation of objects to the system would result in relevant information being displayed to an end user, though it may be desirable to do so. Such relevant information may include the relationships amongst objects, the relationship and meaning of those objects to the user, another person, or a community. For example, in the health care industry, it may be desirable to provide a system which is capable of interpreting test results by placing, for example, rapid diagnostic tests ("RDTs") onto a surface, having such tests read in situ, and having projected onto those tests a basic and intuitive interpretation of the results, for instance a green border for good results and another color for tests in which it was determined the testing was conducted improperly, and appropriate next steps including possible treatments based on medical records retrieved automatically by the system from a database. Further, it may be desirable in such an industry to minimize physical contact between the end user and the surface or tests to reduce contact with possibly infected samples or surfaces.
[0007] As a result, there may be a need for, or it may be desirable to provide, one or more systems, methods, computer readable media, and/or cooperating environments that overcomes one or more of the limitations associated with the prior art. It may be advantageous to provide a system, method, and/or computer readable medium that preferably provides for a user interface system facilitating the presentation of relevant data in a natural and/or contextualized way.
[0008] It may be an object of one preferred embodiment according to the invention to recognize presented objects.
[0009] It may be an object of one preferred embodiment according to the invention to minimize the physical contact required in the analysis of an object. [0010] It may be an object of one preferred embodiment according to the invention to present relevant information to an end-user in a non-invasive manner.
[0011] It may be an object of one preferred embodiment according to the invention to batch process a series of objects presented to it.
[0012] It may be an object of one preferred embodiment according to the invention to allow minimally trained individuals to perform a variety of tasks though the projection of a series of aids.
[0012] It may be an object of one preferred embodiment according to the invention to allow minimally trained individuals to perform a variety of tasks through the projection of a series of aids.
SUMMARY OF THE INVENTION
[0014] The present disclosure provides a system, method and/or computer-readable medium for presenting augmented data. More specifically, embodiments of the present invention are directed to a system, method, and/or computer-readable medium for generating and displaying augmented data to enhance one or more workflows for a user. The system, method and/or non-transitory computer-readable medium includes an image capture device for capturing an image of the object within an image capture area, the image including object data. Also included is a database comprising supplementary data associated with the object. A work area processor receives the image, accesses the database, and automatically uses the object data and the supplementary data to generate augmented data associated with the object. A presentation device is provided for presenting the augmented data associated with the object by displaying the augmented data on the surface within a presentation area. Thus, according to the invention, the system, method and/or non-transitory computer readable medium is operative to facilitate non-invasive augmentation of the one or more workflows for the user.
[0015] According to an aspect of one preferred embodiment of the invention, the object comprises a face, a label, an RDT, a body part, packaging, and/or a drug.
[0016] According to an aspect of one preferred embodiment of the invention, at least a portion of the presentation area overlaps with the image capture area.
[0017] According to an aspect of one preferred embodiment of the invention, a cover, supported by the surface, can be used to facilitate registration of the presentation area and the image capture area in relation to the surface.
[0018] According to an aspect of one preferred embodiment of the invention, the cover provides fiducials to enhance registration.
[0019] According to an aspect of one preferred embodiment of the invention, the presentation device and the image capture device are elevated in relation to the surface.
[0020] According to an aspect of one preferred embodiment of the invention, the augmented data is adapted for user interaction.
[0021] According to an aspect of one preferred embodiment of the invention, the augmented data comprises patient identification, color coding, a user interface, medical information, patient history, training information, and/or drug information.
[0022] Other advantages, features and characteristics of the present invention, as well as methods of operation and functions of the related elements of the system, method and computer readable medium, and the combination of steps, parts and economies of manufacture, will become more apparent upon consideration of the following detailed description and the appended claims with reference to the accompanying drawings, the latter of which are briefly described herein below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The novel features which are believed to be characteristic of the system, device and methods according to the present invention, as to their structure, organization, use, and method of operation, together with further objectives and advantages thereof, may be better understood from the following drawings in which presently preferred embodiments of the invention may now be illustrated by way of example. It is expressly understood, however, that the drawings are for the purpose of illustration and description only, and are not intended as a definition of the limits of the invention. In the accompanying drawings:
[0024] FIG. 1 is an illustration of an embodiment of the present invention;
[0025] FIG. 2 is an illustration of yet another embodiment of the present invention;
[0026] FIG. 3 is an illustration of yet another embodiment of the present invention;
[0027] FIG. 4 is an illustration of yet another embodiment of the present invention;
[0028] FIG. 5 is a flow chart of an embodiment of the present invention;
[0029] FIG. 6 is an illustration of yet another embodiment of the present invention;
[0030] FIG. 7 is an illustration of yet another embodiment of the present invention;
[0031] FIG. 8 is an illustration of yet another embodiment of the present invention;
[0032] FIG. 9 is an illustration of yet another embodiment of the present invention;
[0033] FIG. 10 is an illustration of yet another embodiment of the present invention;
[0034] FIG. 11 is an illustration of yet another embodiment of the present invention;
[0035] FIG. 12 is an illustration of yet another embodiment of the present invention;
[0036] FIG. 13 is an illustration of yet another embodiment of the present invention;
[0037] FIG. 14 is an illustration of yet another embodiment of the present invention;
[0038] FIG. 15 is an illustration of yet another embodiment of the present invention;
[0039] FIG. 16 is an illustration of yet another embodiment of the present invention;
[0040] FIG. 17 is an illustration of yet another embodiment of the present invention; and
[0041] FIG. 18 is an illustration of yet another embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0042] The description that follows, and the embodiments described therein, may be provided by way of illustration of an example, or examples, of particular embodiments of the principles of the present invention. These examples are provided for the purposes of explanation, and not of limitation, of those principles and of the invention. In the description, like parts are marked throughout the specification and the drawings with the same respective reference numerals. The drawings are not necessarily to scale and in some instances proportions may have been exaggerated in order to more clearly depict certain embodiments and features of the invention.
[0043] The present disclosure may now be described in terms of an exemplary system in which the present disclosure, in various embodiments, would be implemented. This may be for convenience only and may not be intended to limit the application of the present disclosure. It may be apparent to one skilled in the relevant art(s) how to implement the present disclosure in alternative embodiments. [0044] In this disclosure, a number of terms and abbreviations may be used. The following definitions and descriptions of such terms and abbreviations are provided in greater detail.
[0045] As used herein, a person skilled in the relevant art may generally understand the term "comprising" to generally mean the presence of the stated features, integers, steps, or components as referred to in the claims, but that it does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
[0046] It should also be appreciated that the present invention can be implemented in numerous ways, including as a process, a method, an apparatus, a system, a device, or a computer readable medium such as a computer readable storage medium or a computer network wherein program instructions are sent over a network (e.g. optical or electronic communication links). In this specification, these implementations, or any other form that the invention may take, may be referred to as processes. In general, the order of the steps of the disclosed processes may be altered within the scope of the invention.
[0047] Reference throughout this specification to "one embodiment," "an embodiment," or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment," "in an embodiment," and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0048] Moreover, the described features, structures, or characteristics of the invention may be combined in any suitable manner in one or more embodiments. It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents. Reference will now be made in detail to the preferred embodiments of the invention.
[0049] Preferred embodiments of the present invention can be implemented in numerous configurations depending on implementation choices based upon the principles described herein. Various specific aspects are disclosed, which are illustrative embodiments not to be construed as limiting the scope of the disclosure. One aspect of the disclosure is a method, computer program product, apparatus, and system for the exchange of data between two or more cloud systems. Although the present specification describes components and functions implemented in the embodiments with reference to standards and protocols known to a person skilled in the art, the present disclosures as well as the embodiments of the present invention are not limited to any specific standard or protocol. Each of the standards for non-mobile and mobile computing, including the Internet and other forms of computer network transmission (e.g., TCP/IP, UDP/IP, HTML, and HTTP), represents an example of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same functions are considered equivalents.
[0050] As those of ordinary skill in the art would generally understand, the Internet is a global computer network which comprises a vast number of computers and computer networks which are interconnected through communication links. A person skilled in the relevant art may understand that an electronic communications network of the present invention may include, but is not limited to, one or more of the following: a local area network, a wide area network, peer to peer communication, an intranet, or the Internet. The interconnected computers exchange information using various services, including, but not limited to, electronic mail, Gopher, web-services, application programming interface (API), and File Transfer Protocol (FTP). This network allows a server computer system (a Web server) to send graphical Web pages of information to a remote client computer system. The remote client computer system can then display the Web pages via its web browser. Each Web page (or link) of the "world wide web" ("WWW") is uniquely identifiable by a Uniform Resource Locator (URL). To view a specific Web page, a client computer system specifies the URL for that Web page in a request (e.g., a Hypertext Transfer Protocol ("HTTP") request). The request is forwarded to the Web server that supports the Web page. When the Web server receives the request, it sends the Web page to the client computer system. When the client computer system receives the Web page, it typically displays the Web page using a browser. A web browser or a browser is a special-purpose application program that effects the requesting of web pages and the displaying of web pages and the use of web-based applications. Commercially available browsers include Microsoft Internet Explorer, Firefox, and Google Chrome, among others. It may be understood that with embodiments of the present invention, any browser would be suitable.
[0051] Web pages are typically defined using HTML. HTML provides a standard set of tags that define how a Web page is to be displayed. When a provider indicates to the browser to display a Web page, the browser sends a request to the server computer system to transfer to the client computer system an HTML document that defines the Web page. When the requested HTML document is received by the client computer system, the browser displays the Web page as defined by the HTML document. The HTML document contains various tags that control the displaying of text, graphics, controls, and other features. The HTML document may contain URLs of other Web pages available on that server computer system or other server computer systems. [0052] A person skilled in the relevant art may generally understand a web-based application refers to any program that is accessed over a network connection using HTTP, rather than existing within a device's memory. Web-based applications often run inside a web browser or web portal. Web-based applications also may be client-based, where a small part of the program is downloaded to a user's desktop, but processing is done over the Internet on an external server. Web-based applications may also be dedicated programs installed on an internet-ready device, such as a smart phone or tablet. A person skilled in the relevant art may understand that a web site may also act as a web portal. A web portal may be a web site that provides a variety of services to users via a collection of web sites or web based applications. A portal is most often one specially designed site or application that brings information together from diverse sources in a uniform way. Usually, each information source gets its dedicated area on the page for displaying information (a portlet); often, the user can configure which ones to display. Portals typically provide an opportunity for users to input information into a system. Variants of portals include "dashboards". The extent to which content is displayed in a "uniform way" may depend on the intended user and the intended purpose, as well as the diversity of the content. Very often design emphasis is on a certain "metaphor" for configuring and customizing the presentation of the content and the chosen implementation framework and/or code libraries. In addition, the role of the user in an organization may determine which content can be added to the portal or deleted from the portal configuration.
[0053] It may be generally understood by a person skilled in the relevant art that the term "mobile device" or "portable device" refers to any portable electronic device that can be used to access a computer network such as, for example, the internet. Typically a portable electronic device comprises a display screen, at least one input/output device, a processor, memory, a power module and a tactile man-machine interface as well as other components that are common to portable electronic devices individuals or members carry with them on a daily basis. Examples of portable devices suitable for use with the present invention include, but are not limited to, smart phones, cell phones, wireless data/email devices, tablets, PDAs and MP3 players, test devices, etc.
[0054] It may be generally understood by a person skilled in the relevant art that the term "network ready device" or "internet ready device" refers to devices that are capable of connecting to and accessing a computer network, such as, for example, the Internet, including but not limited to an IoT device. A network ready device may access the computer network through well-known methods, including, for example, a web-browser. Examples of internet-ready devices include, but are not limited to, mobile devices (including smart-phones, tablets, PDAs, etc.), gaming consoles, and smart-TVs. It may be understood by a person skilled in the relevant art that embodiments of the present invention may be expanded to include applications for use on a network ready device (e.g. cellphone). In a preferred embodiment, the network ready device version of the applicable software may have a similar look and feel as a browser version but may be optimized to the device. It may be understood that other "smart" devices (devices that are capable of connecting to and accessing a computer network, such as, for example, the internet) such as medical or test devices, including but not limited to smart blood pressure monitors, smart glucometers, IoT devices, etc., may also be used with embodiments of the present invention.
[0055] It may be further generally understood by a person skilled in the relevant art that the term "downloading" refers to receiving datum or data to a local system (e.g. mobile device) from a remote system (e.g. a client) or to initiate such a datum or data transfer. Examples of remote systems or clients from which a download might be performed include, but are not limited to, web servers, FTP servers, email servers, or other similar systems. A download can mean either any file that may be offered for downloading or that has been downloaded, or the process of receiving such a file. A person skilled in the relevant art may understand that the inverse operation, namely the sending of data from a local system (e.g. mobile device) to a remote system (e.g. a database), may be referred to as "uploading". The data and/or information used according to the present invention may be updated constantly, hourly, daily, weekly, monthly, yearly, etc. depending on the type of data and/or the level of importance inherent in, and/or assigned to, each type of data. Some of the data may preferably be downloaded from the Internet, by satellite networks or other wired or wireless networks.
[0056] Elements of the present invention may be implemented with computer systems which are well known in the art. Generally speaking, computers include a central processor, system memory, and a system bus that couples various system components including the system memory to the central processor. A system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The structure of a system memory may be well known to those skilled in the art and may include a basic input/output system ("BIOS") stored in a read only memory ("ROM") and one or more program modules such as operating systems, application programs and program data stored in random access memory ("RAM"). Computers may also include a variety of interface units and drives for reading and writing data. A user of the system can interact with the computer using a variety of input devices, all of which are known to a person skilled in the relevant art.
[0057] One skilled in the relevant art would appreciate that the device connections mentioned herein are for illustration purposes only and that any number of possible configurations and selection of peripheral devices could be coupled to the computer system. [0058] Computers can operate in a networked environment using logical connections to one or more remote computers or other devices, such as a server, a router, a network personal computer, a peer device or other common network node, a wireless telephone or wireless personal digital assistant. The computer of the present invention may include a network interface that couples the system bus to a local area network ("LAN"). Networking environments are commonplace in offices, enterprise-wide computer networks and home computer systems. A wide area network ("WAN"), such as the Internet, can also be accessed by the computer or mobile device.
[0059] It may be appreciated that the types of connections contemplated herein are exemplary and other ways of establishing a communications link between computers may be used in accordance with the present invention, including, for example, mobile devices and networks. The existence of any of various well-known protocols, such as TCP/IP, Frame Relay, Ethernet, FTP, HTTP and the like, may be presumed, and the computer can be operated in a client-server configuration to permit a user to retrieve and send data to and from a web-based server. Furthermore, any of various conventional web browsers can be used to display and manipulate data in association with a web based application.
[0060] The operation of the network ready device (i.e., a mobile device) may be controlled by a variety of different program modules, engines, etc. Examples of program modules are routines, algorithms, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. It may be understood that the present invention may also be practiced with other computer system configurations, including multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, personal computers, minicomputers, mainframe computers, and the like. Furthermore, the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
[0061] Embodiments of the present invention may include a computer system. It may be understood by a person skilled in the relevant art that the computer system can be a personal computer, mobile device, notebook computer, server computer, mainframe, networked computer (e.g., router), workstation, and the like. In one embodiment, the computer system includes a processor coupled to a bus and memory storage coupled to the bus. The memory storage can be volatile or non-volatile (i.e. transitory or non-transitory) and can include removable storage media. The computer can also include a display, provision for data input and output, etc. as may be understood by a person skilled in the relevant art.
[0062] Some portions of the detailed descriptions that follow are presented in terms of procedures, steps, logic block, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer executed step, logic block, process, etc. is here, and generally, conceived to be a self-consistent sequence of operations or instructions leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like. [0063] FIG. 18 illustrates a more detailed diagram of an example computing device 400 within which a set of instructions, for causing the computing device to perform any one or more of the methods discussed herein, may be executed. The computing device 400 may include additional or different components, some of which may be optional and not necessary to provide aspects of the present disclosure. The computing device may be connected to other computing devices in a LAN, an intranet, an extranet, or the Internet. The computing device may operate in the capacity of a server or a client computing device in client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment. The computing device may be provided by a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, or any computing device capable of executing a set of instructions (sequential or otherwise) that specify operations to be performed by that computing device. Further, while only a single computing device is illustrated, the term "computing device" shall also be taken to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[0064] Exemplary computing device 400 includes a processor 402, a main memory 404 (e.g., read-only memory (ROM) or dynamic random access memory (DRAM)), and a data storage device 414, which communicate with each other via a bus 426.
[0065] Processor 402 may be represented by one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, processor 402 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. Processor 402 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. Processor 402 is configured to execute instructions 424 for performing the operations and functions discussed herein.
[0066] Computing device 400 may further include a network interface device 406, image capture devices (e.g. video cameras, IR cameras, depth-sensing cameras, or multi-camera based systems) 410, a video display unit (e.g. an LCD screen or projection device) 420, a character input device 418 (e.g., a keyboard), and a touch screen input device 416.
[0067] Data storage device 414 may include a computer-readable storage medium 412 on which is stored one or more sets of instructions 424 embodying any one or more of the methodologies or functions described herein. Instructions 424 may also reside, completely or at least partially, within main memory 404 and/or within processor 402 during execution thereof by computing device 400, main memory 404 and processor 402 also constituting computer-readable storage media. Instructions 424 may further be transmitted or received over network 408 via network interface device 406.
[0068] A preferred embodiment of the present invention provides an information handling system to create an end user immersed environment by managing the presentation of information and acceptance of inputs through a user interface engine, such as instructions executing in one or more of CPU, chipset and individual input and output components like those described below. A display interfaces with a graphics system to receive pixel information that is presented as visual images. Displays may be configured as an integrated device, such as in a tablet, laptop or convertible information handling system, or as a peripheral device coupled through a cable or wireless interface. A projector is, for example, essentially a display device that projects pixel information against a projection surface, such as a desktop surface. An infrared emitter projects infrared light to aid in touch detection and/or resolution of three dimensional images captured by a camera. For example, an infrared emitter projects an IR curtain just above a surface to detect touches at the surface that pass through the IR curtain. As another example, an infrared emitter may project structured light that does not disrupt the end user but provides the camera with depth information that aids analysis of images to detect body parts, such as fingers and hands, and dimensions. The user interface engine manages inputs and outputs to other types of devices, including a mouse, keyboard, camera and touchscreen. The user interface engine applies images captured by the camera, with and without enhancements by the infrared emitter, to manage passive input devices, such as totems that an end user manipulates to indicate an input. The user interface engine tracks input and output devices relative to applications, windows and content data with a user interface table that allows rapid transitions of presented content between available output devices.
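As a minimal illustrative sketch of the user interface table described above, the following assumes a simple mapping from content items to output devices so that presented content can be transitioned rapidly between, for example, a projector and a tablet display. The class name, device names and content identifiers are hypothetical and are not taken from the specification.

```python
from typing import Dict

class UserInterfaceTable:
    """Tracks which output device currently presents each piece of content,
    so content can be moved quickly between available output devices."""

    def __init__(self) -> None:
        self._routes: Dict[str, str] = {}   # content_id -> device_id

    def assign(self, content_id: str, device_id: str) -> None:
        """Record that a content item is currently presented on a given device."""
        self._routes[content_id] = device_id

    def move(self, content_id: str, new_device_id: str) -> str:
        """Transition content to another output device, returning the previous device."""
        previous = self._routes.get(content_id, "unassigned")
        self._routes[content_id] = new_device_id
        return previous

# Example: move a projected patient-record window from the projector to a tablet screen.
ui_table = UserInterfaceTable()
ui_table.assign("patient_record_window", "projector")
ui_table.move("patient_record_window", "tablet_display")
```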
[0069] In accordance with a preferred aspect of the present invention, a person skilled in the relevant art would generally understand the term "application" or "application software" to refer to a program or group of programs designed for end users. While system software typically comprises, but is not limited to, lower level programs (e.g. programs that interact with computers at a basic level), application software resides above system software and may include, but is not limited to, database programs, word processors, spreadsheets, etc. Application software may be grouped along with system software or published alone. Application software may simply be referred to as an "application".
[0070] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as "receiving", "creating", "providing", "communicating" or the like refer to the actions and processes of a computer system, or similar electronic computing device, including an embedded system, that manipulates and transfers data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices. As used herein, reference to the "transmission", "processing", "interpretation" or the like of data associated with a cloud may refer to advancing through logic contained in the guideline. This may be accomplished, among other methods, by running on a processor one or more computer programs representative of the algorithms, processes, etc.
[0071] According to the invention, one or more non-invasive workflow augmentation systems, methods, computer-readable media, and/or cooperating environments may be disclosed.
[0072] The invention is contemplated for use in association with one or more cooperating environments, to afford increased functionality and/or advantageous utilities in association with same. The invention, however, is not so limited.
[0073] Certain novel features which are believed to be characteristic of a non-invasive workflow augmentation system, method, computer readable medium, and/or certain features of the system, method, computer readable medium which are novel in conjunction with the cooperating environment, according to the present invention, as to their organization, use, and/or method of operation, together with further objectives and/or advantages thereof, may be better understood from the accompanying disclosure in which presently preferred embodiments of the invention are disclosed by way of example. It is expressly understood, however, that the accompanying disclosure is for the purpose of illustration and/or description only, and is not intended as a definition of the limits of the invention. [0074] Naturally, in view of the teachings and disclosures herein, persons having ordinary skill in the art may appreciate that alternate designs and/or embodiments of the invention may be possible (e.g., with substitution of one or more steps, algorithms, processes, features, structures, parts, components, modules, utilities, etc. for others, with alternate relations and/or configurations of steps, algorithms, processes, features, structures, parts, components, modules, utilities, etc).
[0075] Although some of the steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, etc. according to the invention are not specifically referenced in association with one another, they may be used, and/or adapted for use, in association therewith.
[0076] One or more of the disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like may be implemented in and/or by the invention, on their own, and/or without reference, regard or likewise implementation of one or more of the other disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like, in various permutations and combinations, as may be readily apparent to those skilled in the art, without departing from the pith, marrow, and spirit of the disclosed invention.
[0077] In certain implementations, instructions 424 may include instructions for noninvasive workflow augmentation systems. While computer-readable storage medium 412 is shown in the example of FIG. 18 to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0078] The methods, components, and features described herein may be implemented by discrete hardware components or may be integrated in the functionality of other hardware components such as ASICs, FPGAs, DSPs or similar devices. In addition, the methods, components, and features may be implemented by firmware modules or functional circuitry within hardware devices. Further, the methods, components, and features may be implemented in any combination of hardware devices and software components, or only in software.
[0079] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present disclosure.
[0080] The present disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
[0081] It is to be understood that the above description is intended to be illustrative, and not restrictive. Various other implementations will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
[0082] In a preferred embodiment, the present invention is adapted to provide noninvasive workflow augmentation. In particular, the present invention is preferably adapted to provide non-invasive workflow augmentation in the delivery of healthcare services. According to the present invention, an interface is preferably presented on a surface for interaction with a user and/or an object.
[0083] According to a preferred embodiment of the present invention, and as best shown in FIGS. 1 and 2, there is provided a presentation device 104 (e.g., a projector, laser, or beamer for projecting both visible and/or invisible light which may include infrared light) adapted to present content to a user 20 (e.g., a user interface, identification information, test results, etc.) within a presentation area 40a,b and an image capture device 102 (e.g., a camera) adapted to capture an image from an image capture area 42a,b. In some embodiments, the presentation device 104 and image capture device 102 are provided as a combined device 106. Preferably, the presentation device 104 and image capture device 102 are positioned in a similar direction - i.e., at least a portion of the presentation area 40a,b overlaps with the image capture area 42a,b. In preferable embodiments, the presentation device 104 and the image capture device 102 are associated with optical axes (40a,b and 42a,b respectively) that are close to each other to provide a significant area of mutual coverage or overlap. In some embodiments, the user interface comprises a menu bar (not shown) with various interactive options for the user 20 including, but not limited to, patient data, device settings, help, etc.
[0084] The presentation device 104 and the image capture device 102 are preferably registered and/or calibrated to one another (e.g., to optimally align and/or position the respective optical axes 40a,b and 42a,b) prior to use by the user (e.g., during the device 106 manufacturing process). As shown in FIGS. 1 and 2 and in accordance with the present invention, prior to use by the user 20, the device 106 registers and/or calibrates the optical axes 40a,b and 42a,b relative to a cover 108 supported by a surface 10 (e.g., a table, a floor, etc.). In preferable embodiments, the cover 108 provides one or more fiducials 44 (e.g., a distinct symbol, design, grid, etc.) to serve as markers in the field of view of the image capture device 102 - the markers appearing in the image captured by the device 102 - for use as a point of reference or a unit of measure. In some embodiments, the presentation device 104 preferably serves as a light source for the image capture device 102.
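A minimal sketch of one possible registration step follows, assuming the fiducials 44 have already been located in the captured image and that OpenCV is available; neither assumption is mandated by the specification. The sketch estimates a homography that maps camera pixels to known cover coordinates, which may then be used to place augmented data 114 relative to the cover 108.

```python
import numpy as np
import cv2  # assumption: OpenCV is used; the specification does not name a library

def register_cover(fiducial_pixels: np.ndarray,
                   fiducial_cover_coords: np.ndarray) -> np.ndarray:
    """Estimate a homography mapping camera pixels to cover coordinates.

    fiducial_pixels       : Nx2 array of fiducial 44 centres found in the captured image.
    fiducial_cover_coords : Nx2 array of the same fiducials' known (predetermined)
                            positions on the cover 108, e.g. in millimetres.
    At least four non-collinear fiducials are required.
    """
    H, _mask = cv2.findHomography(fiducial_pixels.astype(np.float32),
                                  fiducial_cover_coords.astype(np.float32),
                                  method=cv2.RANSAC)
    return H

def image_point_to_cover(H: np.ndarray, x: float, y: float) -> tuple:
    """Map a single pixel (x, y) from the image capture area into cover coordinates."""
    p = np.array([x, y, 1.0])
    q = H @ p
    return (q[0] / q[2], q[1] / q[2])
```

Under this sketch, re-running the estimation whenever the fiducials shift in the image would allow the device 106 to tolerate movements of the cover during use.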
[0085] According to a preferred embodiment of the present invention, as shown in FIG. 2, the presentation device 104 and the image capture device 102 are positioned above the cover 108 via a support 110 that may be fixed (e.g., clamped to the surface, affixed to a vertical support secured to the surface, overhanging the cover, etc.) or flexible (e.g., gooseneck, hinged vertical support, affixed to a vertical support unsecured to the surface, etc.). Persons skilled in the art will appreciate that while a number of supports may be used to position the device, the support 110 preferably positions the device 106 so that it is elevated relative to the cover 108. The device processor 112 is associated with the device 106 and in preferable embodiments is combined with the device 106.
[0086] In preferable embodiments, and as shown in FIG. 2, a workflow area 52 is provided by the user 20 and is the area 52 wherein the workflows are conducted. In accordance with the present invention, the device 106 is non-intrusive with respect to the workflows. The device 106 preferably merges with the desired workflows of the user 20 with minimal disruption. Minimal training and modification of the workflows are required.
[0087] As shown in FIG. 3, the system 100 in accordance with the present invention includes the work area subsystem 50, an administrator subsystem 70, and databases 80a,b,c (collectively "databases 80"). The work area subsystem 50, for use by the user 20 and/or a patient 30, includes the image capture device 102 and the presentation device 104 (collectively, the device 106) for use with the objects 22. The device 106 is preferably elevated by the support 110 relative to the cover 108 supported by the surface 10. The presentation device 104 presents content 114 (alternately "augmented data 114") on the cover 108. The device processor 112, a secondary device 116 (e.g., tablet, laptop, smartphone, etc.) and work area database 118 are also provided in the work area 50.
[0088] The administrator subsystem 70 includes an administrator processor 72, an administrator database 74 and an administrator report 76.
[0089] In FIG. 3, the system 100 is shown in use with one or more communication networks 100. The communication networks 100 may include satellite networks (e.g., GPS networks), terrestrial wireless networks, and the Internet. Persons having ordinary skill in the art will appreciate that the system 100 includes hardware and software.
[0090] FIG. 4 schematically illustrates, among other things, that the work area subsystem 50 includes the work area processor 112, input-output devices 120 including the presentation device 104, the image capture device 102 and a speaker 123 for providing the user with audible information. Also included is the work area database 118, the secondary device 116, and a work area receiver-transmitter device 122. A computer readable medium (e.g., a work area processor-readable memory) 130 stores one or more algorithms 801, 802, 803, 804 and/or 808 operatively encoded for use by the work area processor 112. The algorithms 801, 802, 803, 804 and/or 808 provide the processor 112 with registration logic 801, recognition logic 802, image capture logic 803, presentation logic 804, and/or object identification logic 808.
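The following is a minimal, hypothetical sketch of how the work area processor 112 might compose the logic modules 801-804 and 808 named above; the class, method names, call order and interfaces are assumptions for illustration only and are not prescribed by the specification.

```python
class WorkAreaProcessor:
    """Sketch of the work area processor 112 composing the logic modules 801-804 and 808."""

    def __init__(self, registration, recognition, image_capture, presentation, object_id):
        self.registration = registration      # registration logic 801
        self.recognition = recognition        # recognition logic 802
        self.image_capture = image_capture    # image capture logic 803
        self.presentation = presentation      # presentation logic 804
        self.object_id = object_id            # object identification logic 808

    def run_once(self, database):
        frame = self.image_capture.capture()               # grab an image of the cover 108
        mapping = self.registration.register(frame)        # locate fiducials 44 / the cover
        objects = self.object_id.identify(frame, mapping)  # find RDTs, drugs, labels, ...
        people = self.recognition.recognize(frame)         # identify the user and/or patient
        augmented = database.augment(objects, people)      # object data + supplementary data
        self.presentation.project(augmented, mapping)      # display augmented data 114
```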
[0091] The administrator subsystem 70 includes the administrator processor 72, the administrator database 74, an input-output device 75, and an administrator receiver-transmitter 76. A computer readable medium (e.g., an administrator processor-readable memory) 78 stores one or more algorithms 805, 806 and/or 807 operatively encoded for use by the administrator processor 72. The algorithms 805, 806 and/or 807 provide the processor 72 with analysis logic 805, report generation logic 806 and/or work area control logic 807.
[0092] In accordance with the present invention, and as shown in FIGS. 1 to 3, the device 106 preferably detects and registers (and/or calibrates with respect to) the cover 108 prior to performing an action. In preferable embodiments, the device 106 presents (e.g., by projection) one or more patterns 114 on the cover 108 to carry out the registration and/or calibration process. The device 106 can preferably tolerate movements during use which may be caused, for example, by a user or patient hitting the surface. In preferable embodiments, the device 106 utilizes fiducials 44 displayed at predetermined coordinates on the cover 108 for registration.
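By way of illustration only, the following sketch shows one way the registration described above could be implemented in software: matched fiducial positions (in presentation coordinates and in the captured image) are used to estimate a homography, which then maps presentation coordinates onto the cover. This is not the claimed registration logic 801; the use of OpenCV, the example fiducial coordinates, and the function names are assumptions, and fiducial detection itself is assumed to have been performed separately.

```python
# Minimal registration sketch (an assumption, not the patented algorithm 801).
# Given matched fiducial positions in projector (presentation) coordinates and
# in the captured camera image, estimate a homography so that content can be
# projected at known locations on the cover.
import numpy as np
import cv2


def register_cover(projector_pts, camera_pts):
    """Estimate the projector-to-camera homography from matched fiducial points."""
    src = np.asarray(projector_pts, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(camera_pts, dtype=np.float32).reshape(-1, 1, 2)
    homography, _mask = cv2.findHomography(src, dst)
    return homography


def to_camera_coords(homography, point):
    """Map a single projector-space point onto the captured image."""
    pt = np.asarray([[point]], dtype=np.float32)       # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(pt, homography)
    return tuple(mapped[0, 0])


# Example with four hypothetical fiducials near the cover corners.
H = register_cover(
    projector_pts=[(0, 0), (1280, 0), (1280, 800), (0, 800)],
    camera_pts=[(102, 88), (1830, 95), (1821, 1012), (96, 1004)],
)
print(to_camera_coords(H, (640, 400)))   # approximate centre of the cover
```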
[0093] In operation, the device 106 preferably augments objects 22 positioned on (and/or associated with) the cover 108 with augmented data 114 that is projected from the presentation device 104. In accordance with the present invention, the device 106 augments one or more objects 22 placed on the cover 108 with various augmented data 114 including, but not limited to, descriptive text and graphics, notices, warnings and instructions. The device 106 preferably provides the user 20 with a natural interface and may be especially advantageous for users 20 with minimal educational background, healthcare training and/or low digital literacy levels. In accordance with the present invention, augmented data 114 is preferably generated using the object data (not shown) and the supplementary data (not shown) associated with the object from the one or more databases.
[0094] In accordance with an embodiment of the present invention, the device 106 interacts with the user 20 by presenting augmented data 114 on the cover 108 that is interactive (e.g., buttons and/or menus). In preferable embodiments, the device 106 engages the user 20 using a natural interface (e.g., hand gestures). In another example, the user 20 may be prompted to interact with projected augmented data 114, such as by pressing a projected button and/or selecting an item from a projected menu. In some embodiments, the user 20 is prompted to provide the device 106 with inputted information by, for example, writing on the cover 108 (e.g., via a finger, an instrument such as a pen, etc.).
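As a hedged illustration of the interactive projected elements described above, the following sketch resolves a detected fingertip position (in cover coordinates) against a set of projected button rectangles. The button names, layout, and coordinates are invented for illustration and are not taken from the disclosure.

```python
# Hit-test a detected fingertip against projected buttons. Button rectangles
# are expressed in cover coordinates (the same space the presentation device
# projects into); the names and layout below are illustrative only.

BUTTONS = {
    "confirm": (100, 600, 300, 680),       # (x_min, y_min, x_max, y_max)
    "cancel": (340, 600, 540, 680),
    "next_patient": (580, 600, 820, 680),
}


def pressed_button(fingertip, buttons=BUTTONS):
    """Return the name of the projected button under the fingertip, if any."""
    x, y = fingertip
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


print(pressed_button((412, 655)))   # -> "cancel"
```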
[0095] According to a preferred embodiment of the invention, the user 20 is not required to excessively physically handle a particular object and may dispose of the cover 108 after conducting an activity in order to promote hygiene and reduce the likelihood of contaminating the site, patient, user, and/or objects (e.g., RDTs). Advantageously, preferred embodiments of the present invention reduce the frequency of conducting decontamination procedures for physical objects (e.g., a reader, a keyboard, a notebook, etc.).
[0096] In accordance with the present invention, and as depicted in FIGS. 5 to 15, preferred embodiments promote non-invasive workflow augmentation for users. The user 20 preferably performs traditional activities that the device 106 preferably augments (e.g., by observing, recording, and/or providing guidance to the user).
[0097] FIG. 5 depicts a workflow or method 200 which includes the following steps: a registration step 202, a user identification step 204, a patient identification step 206, an object identification step 208, a content presentation step 210, an activity step 212, a cover disposal step 214 and/or a step 216 of accessing (including uploading and downloading) cover registration data, user identification data, patient identification data, object identification data, content presentation data, and/or activity data from the databases 74, 80, 118.
[0098] With reference to FIGS. 5 to 15, in preferable embodiments, the registration step 202 may include the registration of the cover 108 using the fiducials 44 and the establishment of a wireless connection with the secondary device 116. The user identification step 204 includes the use of the device 106 to identify the user 20 via, for example, the facial recognition algorithm 802 with access to the databases 74, 80, 118 using the database access step 216. The patient identification step 206 includes the use of the device 106 to identify the patient 30 via, for example, the facial recognition algorithm 802 with access to the databases 74, 80, 118 using the database access step 216. Alternatively, the device 106 may capture an image comprising an object 22 positioned on the cover 108 during the object identification step 208 via, for example, the object identification algorithm 808 with access to the databases 74, 80, 118 using the database access step 216, the object 22 identifying the patient (e.g., a health card, a driver's license, etc.). Object identification may be similar to the inventions disclosed in U.S. Provisional Patent Application Serial Nos. 62/426,494 filed on November 26, 2016 and 62/426,515 filed on November 26, 2016, the contents of both of which are incorporated herein by reference. In accordance with the present invention, the object 22 comprises object data (not shown) including features and information associated with the object 22. The content presentation step 210 includes the presentation of relevant augmented data 114, for example a patient profile (e.g., name, patient identifier, weight, height, last visit, patient history, etc.), on the cover 108 by the presentation device 104 for the user 20 via the presentation algorithm 804 with access to the databases 74, 80, 118 using the database access step 216.
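The disclosure does not specify how the facial recognition algorithm 802 operates; purely as a stand-in, the following sketch performs user/patient identification with the open-source face_recognition package against a small enrolment database. The file names, identifiers, and tolerance are assumptions.

```python
# Illustrative user/patient identification sketch; not the claimed method.
import face_recognition

# Hypothetical enrolment database: user/patient id -> stored face encoding.
known = {
    "user-20": face_recognition.face_encodings(
        face_recognition.load_image_file("enrolled_user_20.jpg"))[0],
    "patient-30": face_recognition.face_encodings(
        face_recognition.load_image_file("enrolled_patient_30.jpg"))[0],
}


def identify(image_path, tolerance=0.6):
    """Return the best-matching enrolled id for the face in image_path, or None."""
    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        return None
    distances = face_recognition.face_distance(list(known.values()), encodings[0])
    best = min(range(len(distances)), key=lambda i: distances[i])
    return list(known.keys())[best] if distances[best] <= tolerance else None


print(identify("capture_from_device_102.jpg"))
```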
[0099] Skilled readers will appreciate that the user identification step 204 may be required to restrict access to authorized users 20 and/or to associate the user 20 with an encounter with the patient 30.
[00100] FIG. 6 depicts an illustration of a patient identification workflow 200a with the user 20 (alternately "health care worker 20") in accordance with an embodiment of the present invention. The workflow 200a includes the registration step 202, the user identification step 204, the patient identification step 206, the content presentation step 210, the activity step 212, and the cover disposal step 214. In the present workflow 200a, the augmented data 114 includes, for example, a patient profile (e.g., name, patient identifier, weight, height, last visit, patient history, etc.). In the activity step 212, the user 20 may preferably interact with the augmented data 114 (e.g., the patient profile can be scrolled by the user 20 using a finger or an instrument to reveal additional information).
[00101] FIG. 7 depicts an illustration of a patient intake workflow 200b in accordance with an embodiment of the present invention. The workflow 200b includes the registration step 202, the user identification step 204, the patient identification step 206, the content presentation step 210, the activity step 212, and the cover disposal step 214. In workflow 200b, the augmented data 114 includes, for example, an interface (e.g., a keyboard) that the user 20 may use to input patient information (e.g., name, patient identifier, weight, height, symptoms, etc.). Alternatively, the device 106 may be adapted to communicate (e.g., Wi-Fi, Bluetooth) with a secondary computing device 116 (e.g., tablet, phone, laptop, etc.) as a secondary display or to input patient information. The secondary computing device 116 may be preferable for the input or display of confidential information associated with the patient 30 (e.g., medical test results).
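As one possible (and entirely assumed) realization of the hand-off to the secondary computing device 116, the sketch below sends confidential patient fields to a listening service on the tablet as a single JSON message over TCP. The disclosure only mentions Wi-Fi or Bluetooth communication; the framing, host address, and port are illustrative.

```python
# Hedged sketch of handing confidential patient fields off to a secondary
# device over the local network; assumes a service is listening on the tablet.
import json
import socket

SECONDARY_DEVICE = ("192.168.1.50", 5555)   # hypothetical tablet address


def send_to_secondary_device(record, address=SECONDARY_DEVICE):
    """Send a patient record to the secondary display as one JSON message."""
    payload = json.dumps(record).encode("utf-8")
    with socket.create_connection(address, timeout=5) as sock:
        sock.sendall(payload)


send_to_secondary_device({
    "patient_id": "P-0031",
    "name": "<confidential - shown on secondary device only>",
    "test_result": "pending",
})
```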
[00102] FIG. 8 depicts an illustration of a test results registration workflow 200c on an object 22 (e.g., an RDT) associated with a patient (not shown) in accordance with an embodiment of the present invention. The workflow 200c includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. The RDT 22 is positioned on the cover 108 and during step 208, the image capture device 102 automatically reads and identifies the RDT 22 by accessing one or more databases 74, 80, 118. The presentation device 104 displays augmented data 114 on the cover 108, during step 210, comprising RDT information (e.g., type of RDT, incubation time, catalog number, test result, etc.). The RDT information is preferably automatically associated or registered with a patient (not shown). In some preferable embodiments, multiple RDTs (not shown) can be positioned on the cover 108 and visually categorized by projecting, for example, a specific color around each RDT. For example, specific colors may be used to categorize RDTs based on patients, test results, incubation time, etc.
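The following sketch illustrates, under assumptions, the lookup performed once an RDT is identified in step 208: a record holding the RDT information listed above is retrieved and paired with a projection color for the frame drawn around the RDT. The in-memory database, field names, and color scheme are illustrative only.

```python
# Illustrative lookup for an identified RDT; the "database" and colours are assumptions.

RDT_DATABASE = {
    "RDT-0001": {"type": "malaria Pf", "incubation_min": 20,
                 "catalog_no": "MAL-PF-25", "result": "negative"},
    "RDT-0002": {"type": "malaria Pf", "incubation_min": 20,
                 "catalog_no": "MAL-PF-25", "result": "positive"},
}

STATUS_COLOURS = {            # colour projected as a frame around the RDT
    "negative": "green",
    "positive": "red",
    "incubating": "yellow",
    "unknown": "grey",
}


def augmented_rdt_data(rdt_id):
    """Return the overlay text and frame colour for one identified RDT."""
    record = RDT_DATABASE.get(rdt_id)
    if record is None:
        return {"text": f"{rdt_id}: not registered", "colour": STATUS_COLOURS["unknown"]}
    text = (f"{record['type']} | cat. {record['catalog_no']} | "
            f"incubate {record['incubation_min']} min | {record['result']}")
    return {"text": text, "colour": STATUS_COLOURS[record["result"]]}


print(augmented_rdt_data("RDT-0002"))
```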
[00103] FIG. 9 depicts an illustration of a multiple object workflow 200d involving two or more objects 22 (e.g., RDTs) in accordance with an embodiment of the present invention. The workflow 200d includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. The RDTs 22 are positioned on the cover 108 and during step 208 the image capture device 102 automatically reads and identifies each RDT 22 by accessing one or more databases 74, 80, 118. The presentation device 104 during step 210 displays augmented data 114 on the cover 108 comprising patient information (e.g., name, patient identifier, weight, height, etc.) associated with one of the RDTs 22 selected by the user 20 during step 212. In preferable embodiments, the presentation device 104 is adapted to color code the patient information to match a color projected around the corresponding RDT 22 during step 210. Similar to the foregoing, the device 106 may be adapted to communicate (e.g., Wi-Fi, Bluetooth) with a secondary computing device 116 (e.g., tablet, phone, laptop, etc.) as a secondary display or to input patient information during the registration step 202. The secondary computing device 116 may be preferable for the input or display of confidential information (e.g., medical test results). In a preferred embodiment, the user 20 can alter the position of an RDT 22 on the cover 108 without interfering with any interaction between the RDT 22 and the device 106. In yet another embodiment, the device 106 may be adapted to provide auditory feedback, for example, sounding a chime if the correct RDT is selected or a positive test result is obtained. Conversely, an alarm may sound if an incorrect RDT is selected or a negative test result is obtained.
[00104] FIG. 10 depicts an illustration of an RDT analysis workflow 200e for analyzing and disposing of an object 22 (e.g., an RDT) in accordance with the present invention. The workflow 200e includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. A first RDT 22a for a first patient and a second RDT 22b for a second patient are positioned on the cover 108 and associated with the respective patient during step 208. In some embodiments of the invention, the first and second RDTs 22 may be associated with a single patient (not shown). Information associated with each RDT 22 is presented on the cover 108 to the user 20 by the presentation device 104 during step 210. Once the analysis is complete, a third RDT 22c may be discarded in a biohazard disposal 28. The third RDT 22c may contain information associated with the same or different patient as the first and second RDTs 22a,b. The augmented data 114 presented for each RDT 22a,b,c during step 210 may preferably include a test result and patient information (e.g., photograph, name, patient identifier, etc.). In preferable embodiments, the device 106 and/or device processor (not shown) automatically records, saves, aggregates, and/or tracks the information received from the RDTs 22a,b,c in the databases 74, 80, 118.
[00105] FIG. 11 depicts an illustration of a treatment workflow 200f for prescribing a patient with a treatment in accordance with the present invention. The workflow 200f includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. Based on the results of step 204, the content presentation step 210 includes a presentation of information associated with the patient, including patient identification, medical history, prescription drug records, etc. In the present example, the user 20 is dispensing an object 22 (e.g., the drug quinine) to the patient 30 during step 212. The image capture device 102 captures an image comprising the drug 22 during step 208, identifies the medication and automatically records the dispensation of the drug to the patient's prescription drug history stored in the databases 74, 80, 118.
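A minimal sketch of the automatic dispensation record described above follows; the record fields, identifiers, and in-memory prescription history are assumptions rather than the disclosed database schema.

```python
# Hedged sketch of automatically recording a dispensation (workflow 200f).
from datetime import date

prescription_history = {"P-0031": []}     # patient id -> list of dispensations


def record_dispensation(patient_id, drug_name, dispensed_by, history=prescription_history):
    """Append one dispensation entry to the patient's prescription drug history."""
    entry = {"drug": drug_name, "date": date.today().isoformat(), "user": dispensed_by}
    history.setdefault(patient_id, []).append(entry)
    return entry


print(record_dispensation("P-0031", "quinine", "user-20"))
```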
[00106] FIG. 12 depicts an illustration of a patient diagnosis workflow 200g in accordance with the present invention. The workflow 200g includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. Based on the results of step 204, the content presentation step 210 includes a presentation of information associated with the patient, such as patient identification information (e.g., name, patient identifier, weight, height, etc.). In addition, during step 212, the user 20 can preferably input additional information based on a patient interview, including: symptoms, notes (e.g., "second degree burn"), prescribed treatment, etc. In preferable embodiments, an image of the patient's symptoms (e.g., burn, rash, etc.) is recorded during step 208.
[00107] FIG. 13 depicts an illustration of a blood collection workflow 200h in accordance with the present invention. The workflow 200h includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. The user 20, during step 212, collects blood from the patient 30 over the cover 108. In preferable embodiments, the image capture device 102 can capture an image or a series of images (e.g., video) of the blood collection during step 208. In a preferable embodiment of the invention, the presentation device 104 is adapted to present content (not shown), such as videos and/or live communication with a second user, to instruct the user 20 on the current procedures for collecting blood.

[00108] FIG. 14 depicts a plan view of a patient physical workflow 200i in accordance with the present invention. The workflow 200i includes the registration step 202, the user identification step 204, the patient identification step 206, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. A secondary device 116 (e.g., a blood pressure monitor) is registered with the device 106 during step 202. The monitor 116 may be positioned on the patient 30 by the user 20. In preferable embodiments, the device 106 may wirelessly communicate with the blood pressure monitor 116 to receive the blood pressure data of the patient 30 during step 212. In other embodiments, the device 106 may use the image capture device 102 to record an image comprising the blood pressure information from the monitor 116 during step 208. The presentation device 104 may be adapted to present the augmented data 114 on the cover 108, which may include patient information such as blood pressure information, weight, height, and blood pressure over time.
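Where the blood pressure reading is recovered from a captured image rather than received wirelessly, optical character recognition is one plausible approach. The sketch below uses pytesseract for that purpose; this library choice, the display format, and the file name are assumptions, not part of the disclosure.

```python
# Hedged sketch of extracting a blood pressure reading from a captured image
# of the monitor display (assumes the Tesseract OCR engine is installed).
import re
from PIL import Image
import pytesseract


def read_blood_pressure(image_path):
    """Return (systolic, diastolic) parsed from the monitor display image, or None."""
    text = pytesseract.image_to_string(Image.open(image_path))
    match = re.search(r"(\d{2,3})\s*/\s*(\d{2,3})", text)
    if match:
        return int(match.group(1)), int(match.group(2))
    return None


print(read_blood_pressure("monitor_116_display.jpg"))   # e.g. (120, 80)
```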
[00109] FIG. 15 depicts an illustration of a medication registration workflow 200j in accordance with the present invention. The workflow 200j includes the registration step 202, the user identification step 204, the object identification step 208, the content presentation step 210, the activity step 212, and the cover disposal step 214. During step 212, the user 20 positions an object 22 (e.g., unregistered medication) on the cover 108 and the image capture device 102 captures an image comprising the unregistered medication 22. The user 20 may also preferably input information about the drug into the database 74, 80, 118 using the interface (not shown), including the drug name, lot number, expiration date, and total number in stock. The registered medication includes the information entered by the user and an image of the medication. The registered medication information may be used in future activities.
[00110] FIG. 16 depicts an example of augmented data 114 that may be projected onto the cover 108 during a workflow 200. In the present example, the augmented data 114 comprises patient identification 114a, patient physiological measures 114b (e.g., heart rate, blood pressure), reason for visit 114c, patient medical history 114d, RDT test results 114e, and a menu bar 114f for the user to adjust settings.
[00111] FIG. 17 depicts an example of batch mode or batch processing according to the present invention. Skilled readers will understand that batch processing is the execution of a series of jobs (e.g., analysis of the RDT test results) without manual intervention. In the present example, objects 22 a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u (e.g., malaria RDTs) are positioned on the cover 108. The device 106 automatically reads each RDT 22 a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u and presents relevant augmented data 114 on the cover 108. The augmented data 114 may be color coded to aid in visual examination. For example, RDTs 22 a,e,r,t may be associated with a negative result. RDTs 22 l,n may be associated with a positive result. RDTs 22 c,o,s may be associated with alert conditions. Lastly, RDTs 22 b,d,f,g,h,i,j,k,m,p,q,u may be associated with RDTs that may not, for example, have been associated with a particular patient. Skilled persons will understand that batch processing of RDT test results can be advantageous in an outbreak situation.
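A minimal batch-mode sketch follows: every RDT read from the cover is classified in one pass into the four categories described above and mapped to the color to be projected around it. The input records and the specific colors are assumptions.

```python
# Batch-mode sketch: classify every RDT read from the cover in one pass and
# assign the colour that will be projected around it (all values illustrative).

BATCH = [
    {"id": "22a", "result": "negative", "patient": "P-01"},
    {"id": "22l", "result": "positive", "patient": "P-07"},
    {"id": "22o", "result": "invalid", "patient": "P-02"},
    {"id": "22q", "result": "negative", "patient": None},   # not yet assigned
]


def categorise(rdt):
    """Map one RDT record to (category, projection colour)."""
    if rdt["patient"] is None:
        return "unassigned", "blue"
    if rdt["result"] == "negative":
        return "negative", "green"
    if rdt["result"] == "positive":
        return "positive", "red"
    return "alert", "yellow"


for rdt in BATCH:                        # no manual intervention per RDT
    category, colour = categorise(rdt)
    print(f"RDT {rdt['id']}: {category} -> project {colour} frame")
```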
[00112] In accordance with the present invention, a larger display and interaction area is preferably provided at a significantly lower cost than that of a comparably large display device.
[00113] In summary, for objects 22 (e.g., RDTs) positioned on the cover 108, the device 106 may present different colors in association with each RDT 22 to indicate a condition (e.g., incubation period and results). Fiducials 44 are preferably used by the software to register the surface. A user 20 may register an RDT using the tablet. The user 20 positions an RDT 22, with a blood sample taken, on the cover 108. The device 106 reads all RDTs 22 simultaneously. A secondary device 116 preferably protects the confidential information on individual RDTs. The device 116 preferably instructs the user to discard the cover 108, which also collects the RDTs, and to dispose of the package. The user will place a new cover 108 on the surface prior to conducting subsequent activities.
[00114] Preferably, the image capture device 102 registers a user's interaction with the projected augmented data 114. Workflow 200 augmentation is preferably enhanced by recognizing objects 22 placed on the cover 108, prompting a response in the projected augmented data 114.
[00115] In accordance with the present invention, there is provided, in some embodiments, a training tool for new health workers (e.g., simulation of patients and clinical cases for new health workers). In a preferred embodiment, the invention provides: training tools for new clinical protocols of new RDTs (or new medical devices); monitoring tools for patients (e.g., monitoring the patient visit, clinical protocol follow-up, collection of demographic data, monitoring tools for clinical testing, RDT testing, etc.); monitoring tools for logistics and health worker management; monitoring of stocks and tests available; and monitoring of health worker performance (related to the training tools).
[00116] Administrator Subsystem
[00117] In accordance with the present invention, the administrator processor 72 may conduct analyses and generate reports 76 based on the workflows 200a-j. In addition, the processor 72 may be used to interact with the work area subsystem 50. For example, the processor 72 may be used to update the work area processor 112 or to facilitate live communication and/or interaction between an administrator (not shown) and the user 20.
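As an illustration only, the administrator report 76 could be built by aggregating workflow records received from the work area subsystem 50, as in the sketch below; the record fields and summary metrics are assumptions.

```python
# Hedged sketch of the administrator report 76: aggregate workflow records
# collected from the work area subsystem (record fields are assumptions).
from collections import Counter


def build_report(records):
    """Summarise completed workflows by type and by user."""
    by_workflow = Counter(r["workflow"] for r in records)
    by_user = Counter(r["user"] for r in records)
    positives = sum(1 for r in records if r.get("result") == "positive")
    return {
        "total_encounters": len(records),
        "by_workflow": dict(by_workflow),
        "by_user": dict(by_user),
        "positive_results": positives,
    }


records = [
    {"workflow": "200c", "user": "user-20", "result": "negative"},
    {"workflow": "200c", "user": "user-20", "result": "positive"},
    {"workflow": "200f", "user": "user-21"},
]
print(build_report(records))
```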
[00118] The foregoing description has been presented for the purpose of illustration and is not intended to be exhaustive or to limit the invention to the precise form disclosed. Other modifications, variations and alterations are possible in light of the above teaching and may be apparent to those skilled in the art, and may be used in the design and manufacture of other embodiments according to the present invention without departing from the spirit and scope of the invention. It is intended that the scope of the invention be limited not by this description but only by the claims forming a part of this application and/or any patent issuing herefrom.
[00119] Data Store
[00120] A preferred embodiment of the present invention provides a system comprising data storage (e.g., databases 118 in FIG. 4) that may be used to store all necessary data required for the operation of the system. A person skilled in the relevant art may understand that a "data store" refers to a repository for temporarily or persistently storing and managing collections of data which include not just repositories like databases (a series of bytes that may be managed by a database management system (DBMS)), but also simpler store types such as simple files, emails, etc. A data store in accordance with the present invention may be one or more databases, co-located or distributed geographically. The data being stored may be in any format that may be applicable to the data itself, but may also be in a format that also encapsulates the data quality.
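For illustration, a data store in the above sense can be as simple as a single-table key/value repository; the sketch below uses the Python standard-library sqlite3 module. The schema and helper names are assumptions, and any DBMS, co-located or distributed, could be substituted.

```python
# Minimal data-store sketch using the standard-library sqlite3 module.
# The single-table key/value schema and sample data are purely illustrative.
import json
import sqlite3


class SimpleDataStore:
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS records (key TEXT PRIMARY KEY, value TEXT)")

    def put(self, key, value):
        self.conn.execute(
            "INSERT OR REPLACE INTO records (key, value) VALUES (?, ?)",
            (key, json.dumps(value)))
        self.conn.commit()

    def get(self, key):
        row = self.conn.execute(
            "SELECT value FROM records WHERE key = ?", (key,)).fetchone()
        return json.loads(row[0]) if row else None


store = SimpleDataStore()
store.put("patient:P-0031", {"name": "Jane Doe", "last_visit": "2017-11-24"})
print(store.get("patient:P-0031"))
```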
[00121] As shown in FIGS. 3 and 4, various data stores or databases 74, 80, 118 may interface with the system of the present invention, preferably including, without limitation, proprietary databases, epidemiologic databases, medical records databases, UN and major/international healthcare institution databases, healthcare and emergency infrastructure databases, education and economic databases, news databases, demographic databases, communication and military infrastructure databases, image databases, and weather, travel, and topographic databases.
[00122] A clinical and healthcare database may preferably contain, among other things, diagnostic and medical data (clinical information), such as, for example, one or more of the following, which may or may not be related to medical events: (a) test results from diagnostic devices equipped with remote data transfer systems and/or global positioning or localization features; (b) information from UN databases and major healthcare international institutions; and/or (c) scenarios and knowledge data.
[00123] A sociological database may preferably contain, among other things, sociological data (human information), such as, for example, one or more of the following: (a) population information from local and/or international demographic databases; (b) political and/or organization systems in the area and/or from international databases; (c) education and/or economic systems in the area and/or from international databases; and/or (d) information from news and/or newspapers, drawn from the Internet or elsewhere.
[00124] An infrastructure database may preferably contain, among other things, infrastructure data or information, such as, for example, one or more of the following: (a) information concerning healthcare infrastructure; (b) information concerning communication infrastructures; and/or (c) information concerning emergency and/or military infrastructure; all preferably drawn from local and/or international databases.
[00125] A geophysics database may preferably contain, among other things, geophysics data or information, such as, for example, one or more of the following: (a) weather and/or climatic information from local databases; and/or (b) topographic information from local and/or international databases.

Claims

WHAT IS CLAIMED IS:
1. A system for enhancing one or more workflows for a user, in relation to a surface and an object, wherein the system comprises:
(a) an image capture device for capturing an image of the object within an image capture area, the image comprising object data;
(b) a database comprising supplementary data associated with the object;
(c) a work area processor operative to receive the image, access the database, and automatically use the object data and the supplementary data to generate augmented data associated with the object; and
(d) a presentation device for presenting the augmented data associated with the object by displaying the augmented data on the surface within a presentation area;
whereby the system is operative to facilitate non-invasive augmentation of the one or more workflows for the user.
2. The system according to claim 1 wherein the object comprises a face, a label, an RDT, a body part, packaging, and/or a drug.
3. The system according to claims 1 to 2 wherein at least a portion of the presentation area and the image capture area overlap.
4. The system according to claims 1 to 3 further comprising a cover, supported by the surface, to facilitate registration of the presentation area and the image capture area in relation to the surface.
5. The system according to claim 4 wherein the cover comprises fiducials to enhance registration.
6. The system according to claims 1 to 5 wherein the presentation device and the image capture device are elevated in relation to the surface.
7. The system according to claims 1 to 6 wherein the augmented data is adapted for user interaction.
8. The system according to claims 1 to 7 wherein the augmented data comprises patient identification, color coding, a user interface, medical information, patient history, training information, and/or drug information.
9. A method for use in association with one or more workflows by a user, in relation to a surface and an object, wherein the method comprises the steps of:
(a) capturing an image of the object within an image capture area, the image comprising object data;
(b) electronically accessing a database comprising supplementary data associated with the object;
(c) operating a work area processor to electronically receive the image and automatically use the object data and the supplementary data to generate augmented data associated with the object; and
(d) presenting the augmented data associated with the object by displaying the augmented data on the surface within a presentation area;
whereby the method operatively facilitates non-invasive augmentation of the one or more workflows for the user.
10. The method according to claim 9, wherein the object comprises a face, a label, an RDT, a body part, packaging, and/or a drug.
11. The method according to claims 9 to 10, further comprising a step, before step (a), of overlapping at least a portion of the presentation area with the image capture area.
12. The method according to claims 9 to 11, further comprising a step, before step (a), of providing a cover, supported by the surface, for registration of the presentation area and the image capture area in relation to the surface.
13. The method according to claim 12, wherein the cover comprises fiducials to enhance the registration step.
14. The method according to claims 9 to 13, further comprising a step, before step (a), of positioning the presentation device and the image capture device at a higher elevation relative to the surface.
15. The method according to claims 9 to 14, further comprising a step of user interaction with the augmented data.
16. The method according to claims 9 to 15, wherein the augmented data comprises patient identification, color coding, a user interface, medical information, patient history, training information, and/or drug information.
17. A non-transitory computer-readable medium on which is physically stored executable instructions which, upon execution, will enhance one or more workflows for a user in relation to a surface and an object; wherein the executable instructions comprise processor instructions for one or more processors to automatically:
(a) capture an image of the object within an image capture area, the image comprising object data;
(b) access a database comprising supplementary data associated with the object;
(c) receive the image, access the database, and automatically use the object data and the supplementary data to generate augmented data associated with the object; and
(d) present the augmented data associated with the object by displaying the augmented data on the surface within a presentation area;
to thus operatively facilitate non-invasive augmentation of the one or more workflows for the user.
PCT/CA2017/051418 2016-11-26 2017-11-24 System, method and/or computer readable medium for non-invasive workflow augmentation WO2018094534A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3083851A CA3083851A1 (en) 2016-11-26 2017-11-24 System, method and/or computer readable medium for non-invasive workflow augmentation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662426492P 2016-11-26 2016-11-26
US62/426,492 2016-11-26

Publications (1)

Publication Number Publication Date
WO2018094534A1 true WO2018094534A1 (en) 2018-05-31

Family

ID=62194578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2017/051418 WO2018094534A1 (en) 2016-11-26 2017-11-24 System, method and/or computer readable medium for non-invasive workflow augmentation

Country Status (2)

Country Link
CA (1) CA3083851A1 (en)
WO (1) WO2018094534A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2438880A1 (en) * 2010-10-05 2012-04-11 Universität Bern Image projection system for projecting image on the surface of an object
US20130085774A1 (en) * 2011-10-04 2013-04-04 Yuanming Chen Semi-automated or fully automated, network and/or web-based, 3d and/or 4d imaging of anatomy for training, rehearsing and/or conducting medical procedures, using multiple standard x-ray and/or other imaging projections, without a need for special hardware and/or systems and/or pre-processing/analysis of a captured image data
US20130342350A1 (en) * 2012-06-26 2013-12-26 Stefan Popescu Method and medical imaging device for communication between a control unit and a patient and/or an operator
US20140081659A1 (en) * 2012-09-17 2014-03-20 Depuy Orthopaedics, Inc. Systems and methods for surgical and interventional planning, support, post-operative follow-up, and functional recovery tracking
US20160324583A1 (en) * 2014-10-14 2016-11-10 Leila KHERADPIR Patient reference tool

Also Published As

Publication number Publication date
CA3083851A1 (en) 2018-05-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17874173

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17874173

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3083851

Country of ref document: CA