AU2007361697B2 - Virtual human interaction system - Google Patents

Virtual human interaction system

Info

Publication number
AU2007361697B2
Authority
AU
Australia
Prior art keywords
virtual
user
patient
case
decision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2007361697A
Other versions
AU2007361697A1 (en)
Inventor
Luke Bracegirdle
Stephen Chapman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Keele University
Original Assignee
Keele University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Keele University
Publication of AU2007361697A1
Application granted
Publication of AU2007361697B2
Priority to AU2013206341A
Legal status: Ceased
Anticipated expiration

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H 10/60: ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual human interaction system is described for use on a web-enabled computer. It facilitates the training and education of medical services practitioners such as doctors, nurses, pharmacists and the like by allowing them to interact with a virtual patient delivered by the system and displayed on the computer screen. The system embodies a plurality of cases, and for each case there are a number of possible outcomes, depending on the choices made by the medical services practitioner at each stage in a particular case. Such choices are made in the form of user input to the system through the computer. Each case consists of a decision tree element consisting of branch elements from which the decision tree may branch in one or more directions toward further branch elements or tree termini, the user input causing the system to move through the decision tree of the case.

Description

WO 2009/068838 PCT/GB2007/050719

VIRTUAL HUMAN INTERACTION SYSTEM

This invention relates to a virtual human interaction system, and more particularly to a virtual human interaction system capable of being provided over local or disparate computer networks to users at one or more terminals, whereat users are presented with a situation involving one or more humans, which are virtually represented on screen at said terminal, and with which said users must interact by providing one or more inputs at the terminal.

More specifically, this invention relates to a virtual patient system ideally intended for trainee medical practitioners, to help them learn or enhance their diagnostic skills based on a simulated doctor/patient scenario which is virtually represented on screen. Of course, while the following description relates almost exclusively to the use of the invention in the medical industry for the purpose specified, the reader will instantly become aware that the invention has a potentially much wider application in the training and education fields generally, and therefore the invention should be considered as encompassing such applications.

BACKGROUND

Virtual education and/or training systems which involve some type of background computer program coupled with images and/or video files (e.g. MPEG, AVI and the like) for display on-screen are well established.
Furthermore, such systems can be provided both locally, in terms of being provided and loaded on an individual, stand-alone, non-networked PC, and in distributed fashion, whereby the system is stored centrally and delivered either physically, in terms of being downloadable to suitably networked PCs, or virtually, in terms of the program being executable at the server side, with the results of the execution (which is to some extent controlled by the user input at the networked PC) then being transmitted in HTML or another suitable format so that the display on the user's PC can be caused to change as program execution continues.

Indeed, there are many examples of such systems. One example can be found at http://medicus.marshall.edu/ and is entitled "The Interactive Patient". In this system, which is presented over the internet in HTML format, the user clicks through a series of stages, such as "History", "Physical Exam", "X-Ray Exam" and "Diagnosis", and with each stage that is clicked, a web page is presented to the user on which some explanatory text concerning the condition, symptoms and medical history of a virtual patient is provided, together with a static, real-life photo image of a doctor greeting, examining, interrogating or otherwise dealing with a patient. Of course, both doctor and patient may be represented by actors, and in the case of systems where video footage is provided to users, such actors would be previously instructed how to behave during filming according to the particular notional plight of the patient, e.g. the actor playing the patient is told to limp as a result of having a notional sprained ankle.

This system is typical of many available on the web, in that a student is presented with a patient case to read, optionally provided with some patient medical history or medical records, and is then presented with a number of related options.
Such systems are fundamentally limited in that they can relate only to one possible situation. For example, the user will be presented with a case to which the photos or video footage used are exclusively appropriate. Additionally, the text used in describing the case is most likely to be hard-coded into the website being provided, with the result that a total re-design is required if such systems are to be useful in training users in other situations. Indeed, in the case of the training of medical practitioners, it is almost imperative that they be exposed to as many different cases and patient diagnosis scenarios as possible, to provide them with as well-rounded and comprehensive a training as possible. In this regard, the type of system immediately previously described is wholly inadequate.

Other systems of this type can be found at:

http://courses.pharmacy.unc.edu/asthma/
http://medguides.medicines.org.uk/ai/ai.aspx?id=AI005&name=Becotide
http://research.bidmc.harvard.edu/VPTutorials/
http://radiography.derby.ac.uk/NOS Conference/Dawn%20Skelton%202.pdf

As an advance on the above, it has been proposed to use virtual reality to enhance the training/user experience. A technical paper entitled "Virtual Patient: a Photo-real Virtual Human for VR-based Therapy" by Bernadette KISS, Balázs BENEDEK, Gábor SZIJÁRTÓ, Gábor CSUKLY and Lajos SIMON discussed a high-fidelity Virtual Human Interface (VHI) system which was developed using low-cost and portable computers. The system features real-time photo-realistic digital replicas of multiple individuals capable of talking, acting and showing emotions, with over 60 different facial expressions. These "virtual patients" appear in a high-performance virtual reality environment featuring full panoramic backgrounds, animated 3D objects, behaviour and A.I. models, a complete vision system for supporting interaction, and advanced animation interfaces.
The VHI takes advantage of the latest advances in computer graphics. As such, it allows medical researchers and practitioners to create real-time responsive virtual humans for their experiments using computer systems priced under $2000.

In this document, the creation of computer-generated, animated humans in real time is used to address the needs of emerging virtual-reality-based medical applications, such as CyberTherapy, virtual patients and digital plastic surgery. The authors developed an open-architecture, low-cost and portable virtual reality system, called the Virtual Human Interface, that employs high-resolution, photo-real virtual humans animated in real time to interact with patients. It is said that this system offers a unique platform for a broad range of clinical and research applications. Examples include virtual patients for training and interviewing, highly realistic 3D environments for cue exposure therapy, and digital faces as a means to diagnose and treat psychological disorders. Its open architecture and multiple layers of interaction possibilities make it ideal for creating controlled, repeatable and standardized medical VR solutions. By virtue of the system proposed, the virtual patients can talk, act and express a wide range of facial expressions, emotions and body gestures. Their motions and actions can be imported from MPEG4 or motion-capture files, or animated. An additional scripting layer allows researchers to use their own scripting controls implemented in XML, HTML, LUA, TCL/TK or TCP/IP.

They state that the system also runs in a browser environment over the Internet and supports multiple modes of use, such as a live video feed, remote teleconferencing and even a virtual studio module (chroma-key) allowing the therapist to enter the patient's virtual space, whether locally or remotely.
Despite the obvious advantages of providing a virtual patient as described above, and in particular the utility of such a system in bringing virtual doctor/patient encounters to life on screen, there are still drawbacks, in that this system requires the designer of particular cases to redesign each case with new video or motion-capture animations every time a new response is required from the virtual patient. As will immediately be appreciated, this represents a massive overhead, and a probably unworkable solution to the problem of providing trainees with a great variety of cases to study. Additionally, users of a system of this type would need to have a high level of technical skill in order to design a new patient case. However, a particular drawback is that a simulation of this type can, at best, only simulate the developer's point of view or research findings. This raises questions about how to simulate accurately the collective viewpoints of a series of subject-area experts, or to simulate the current evidence base for a specific domain and demonstrate to the learner the probable results of their decisions in treating the patient (e.g. how do you show the student that, had their action in the simulation been taken with a real-life patient, it would have had a 63% probability of harming the patient?). Other key messages based on published evidence for a certain therapeutic area (and not on personal opinion) need to be conceptualised in order for a simulation to take on a greater value.

At this point, it will be beneficial to consider the existing work in the field of decision analysis.
Systems based on decision analysis are currently known, and one example can be found at:

http://www.hud.ac.uk/schools/hhs/depa nts/nursinLtpenfield site/default.htm

This system, developed by the University of Huddersfield, UK, creates a "virtual hospital" in HTML and other code, and is a computer-based learning tool for health care professionals that simulates the care context for patients within the environmental context of a general hospital.

The system has been built from components that reflect typical aspects of a care environment, e.g. patients, patient assessment forms, observation records etc. The components provide the facilitator with the means to support complex and imaginative patient-based scenarios to satisfy key learning outcomes. The system is innovative in the design and delivery of course materials, as it encourages students to engage with nursing matter through vignettes and patient-based scenarios within the context of application to patient care, and allows students to explore wider issues relating to management of the care environment through duty roster and resource management and exploration of evidence-based practice.

In this system, however, cartoon-like animations are provided on-screen as opposed to virtual full-size patients, and although the system provides a good overall "feel", it requires each case to be designed ab initio in advance, rather than certain decisions each resulting in an ad hoc animation from a virtual patient. It is also deficient in that it allows little scope for extensive development, for example by being scalable to include vast numbers of doctor/patient cases or prognosis/diagnosis scenarios.

Another system is known from US 2005/0170323, which discloses a computer system in which normal data indicating normal conditions in a patient is stored together with abnormality data received from an author, a medical knowledge base, and a mentoring knowledge base.
An instance of a virtual patient is generated based on the normal data and the abnormality data, the instance describing a sufficiently comprehensive physical state of a patient having the abnormal condition to simulate clinical measurements of the patient's condition. Action data is received from a trainee who is different from the author; the action data indicates a requested action relevant to dealing with the instance. Response data is generated based on the action data and the instance, and display data is presented to the trainee based on the response data. The display data indicates information about the instance available as a result of the requested action. The system does not provide any teaching in relation to the generation, acquisition and utilisation of computer-generated media that combine clinical experience and published evidence. Moreover, the system requires the active input of the author, whose online presence is therefore needed for full functionality.

A further disadvantage of all these systems is that none provides a medium whereby an educator can design his own cases for teaching purposes without some computer programming experience.

Embodiments of the present invention may provide a system which overcomes at least some of the disadvantages of the prior art and provides a means whereby a given scenario can be analysed by a student or the like, who is then allowed to take some action (e.g. diagnosis and/or prognosis, or changes to medication or lifestyle), with the effects of that action being visualised in real time, in combination with virtual reality technology for representing humans on-screen, to provide a remarkable learning experience.
It is intended to combine some of the inherent benefits of current technology and propose an innovative system and method providing an evidence-based simulation that gives immediate feedback to the user via a virtual patient computer-generated character, which is not limited to any particular platform but may be implemented on various different platforms using the technology resident on many modern computer systems.
Any discussion of documents, acts, materials, devices, articles or the like which has been included in the present specification is not to be taken as an admission that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present disclosure as it existed before the priority date of each claim of this application.

BRIEF SUMMARY OF THE DISCLOSURE

According to a first aspect of the present invention, there is provided a virtual human interaction system having a user terminal, said system being adapted to be capable of providing a plurality of virtual situations to a user, each virtual situation being capable of providing a variety of outcomes to the user, user input to the system determining which outcome is provided, said system comprising at least:

a respective case file for each virtual situation, each case file consisting of at least a decision tree element consisting of branch elements from which the decision tree may branch in one or more directions toward further branch elements or tree termini, each branch element and terminus including descriptors of a particular condition of the virtual human in the specific virtual situation, the user navigating, in use, the decision tree element during interaction with the system;

a virtual human representation element capable of being displayed graphically to appear as a virtual human on screen;

a plurality of appearance descriptor elements, respectively associated with the branch elements or tree termini, for interacting with the virtual human representation element so that the appearance of the virtual human representation element reflects where in the decision tree element the user has navigated to, said appearance descriptor elements being based on real-life human conditions which affect the physical appearance of humans generally and which are thus mimicked in the virtual human representation element; and

a decision engine
which is arranged to cause the appearance of the virtual human to change by applying one or more of said appearance descriptor elements to said virtual human representation element when, as a result of user input at said terminal, the user navigates from one of the branch elements or termini of the decision tree having a descriptor element associated with it to another branch element or terminus of the decision tree having a respectively different descriptor element associated with it; wherein

the decision engine is arranged:

to parse the case file and to render the content into a machine-readable format;

to call external resources as necessary, depending on where in the decision tree element the user has navigated;

to render each case file, and any external resources which have been called, and format such for provision to the user;

to track the decisions taken by a user when navigating the decision tree element; and

to pass data on the tracked decisions to a database for recordal;

and wherein the database is arranged to track the decisions which are taken, in use, within each case, and to keep a record of the location of the external resources which are available to be called by the decision engine.

The virtual human element may be provided in the form of a virtual patient displayed on-screen at the user terminal.
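The decision-tree navigation and decision tracking described in the summary above can be sketched in a few lines of code. This is an illustrative model only: the patent does not prescribe a concrete data structure, so all class and field names here (`BranchElement`, `DecisionEngine`, `appearance`, `path`) are assumptions.

```python
# Illustrative sketch of the decision tree and decision engine described
# above. The data model is an assumption, not the patented implementation.

class BranchElement:
    """A node in a case's decision tree: a condition descriptor, an
    optional appearance descriptor, and the outgoing decisions."""
    def __init__(self, node_id, descriptor, appearance=None, decisions=None):
        self.node_id = node_id
        self.descriptor = descriptor      # text shown to the user
        self.appearance = appearance      # e.g. "breathless", applied to the avatar
        self.decisions = decisions or {}  # user choice -> next node_id

    def is_terminus(self):
        # A terminus has no outgoing branches.
        return not self.decisions

class DecisionEngine:
    def __init__(self, tree, start_id):
        self.tree = tree                  # node_id -> BranchElement
        self.current = tree[start_id]
        self.path = [start_id]            # tracked decisions, for the database

    def choose(self, decision):
        """Navigate one branch; returns the new node so the caller can
        re-render the virtual human with its appearance descriptor."""
        next_id = self.current.decisions[decision]
        self.current = self.tree[next_id]
        self.path.append(next_id)
        return self.current
```

In this sketch, the `path` list is what would be passed to the database for recordal, and the returned node's `appearance` field is what the rendering layer would apply to the virtual human representation element.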
In the system, the virtual patient may be arranged to suffer from a predetermined ailment or condition; the branch element or terminus descriptors, which may be complete or incomplete as far as the condition of the virtual patient is concerned, may be arranged to be provided successively by the decision engine to the user as information which should be indicative of, or at least suggestive of, the particular ailment or condition; and the decision engine may be arranged to provide one or more options to the medical services practitioner, from which a selection of one or more options is made, said options being returnable to the decision engine to cause the decision engine to move to the next branch element or terminus in the decision tree, to display the next descriptor associated therewith, and to cause the appearance of the virtual patient to change if so demanded by the case and the previous user input.

Embodiments of the present invention may differ from the prior art systems discussed hereinbefore in several important respects.

First, animations are invoked at a code level based on the interaction with the user. This results in an asynchronous dialogue between the user and the virtual human (in the examples given below, reference will be made to a "virtual patient", but it will be understood that embodiments of the invention may apply to situations other than healthcare applications), which allows the tool to be used without the need for a second human operator. This increases the number of applications available to embodiments of the invention, as the tool can be used as a virtual patient 'on demand', or even as a virtual patient across a distributed network such as the Internet. Embodiments of the present invention allow the interchanging of patients for a specific case, for example by changing the sex, ethnicity, age or apparent social background of the virtual patient.
This opens up the field of study to evaluate decision-making by the user should the patient differ in their gender, ethnicity, age, social background etc.
Secondly, embodiments of the present invention differ in their ability to convert real-world experience and published evidence into a machine-readable format. It is this format that invokes the suitable animation and audio from the virtual patient, based on the user's interaction. The research basis for this design is fundamentally founded on the development of decision analysis technology, which allows this real-world information to be stored in an efficient manner. Although the use of decision analysis on its own is not new, embodiments of the present invention add an additional step to the process: the ability to convert real-world information into a machine-readable format by way of decision analysis techniques. In particular, a decision engine may be provided so as to parse a data file (e.g. in human-readable XML format or another human-readable format) and to convert this into a machine-readable format using decision analysis techniques.

The primary benefit of including decision analysis in embodiments of the present invention is that it allows the simulation of real-world evidence in a setting that can be conceptualised easily (e.g. a patient sitting in a surgery and exhibiting side effects consistent with the published evidence). Therefore it should be noted that embodiments of the present invention differ in a third way, by their ability to simulate the evidence of a large cross-population of experts. In other words, rather than simulating the point of view of a single designer (or team of designers) for a virtual patient case based on a set of personal opinions, the collective point of view, or individual points of view, of a large population of experts (taken from, for example, published literature) may be used as data for decision analysis and for the outcomes of certain decisions made by the user.
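The parsing step described above, from a human-readable XML case file to a machine-readable structure, can be sketched with the standard library. The XML schema below is invented purely for illustration; the patent names XML as one possible human-readable format but does not define a schema, so every element and attribute name here is an assumption.

```python
# Minimal sketch of parsing a human-readable XML case file into a
# machine-readable dict, as the decision engine is described as doing.
# The schema (case/scenario/decision) is an illustrative assumption.
import xml.etree.ElementTree as ET

CASE_XML = """
<case patient="Anne">
  <scenario id="1" audience="Pharmacy">
    <description>Anne attends for review after an A+E admission.</description>
    <decision type="treat" next="2">Step up inhaled steroid</decision>
    <decision type="refer" next="3">Refer back to A+E</decision>
  </scenario>
</case>
"""

def parse_case(xml_text):
    """Convert the human-readable case file into a nested dict keyed by
    scenario number, ready for the decision engine to navigate."""
    root = ET.fromstring(xml_text)
    scenarios = {}
    for s in root.findall("scenario"):
        scenarios[s.get("id")] = {
            "audience": s.get("audience"),
            "description": s.findtext("description").strip(),
            "decisions": [
                {"type": d.get("type"), "next": d.get("next"), "label": d.text}
                for d in s.findall("decision")
            ],
        }
    return {"patient": root.get("patient"), "scenarios": scenarios}
```

The resulting dictionary is the kind of machine-readable form from which animations and audio could then be invoked in response to the user's choices.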
This has the added benefit of providing the user with tailored feedback based on decisions taken, in line with the evidence and experience related to a virtual patient's case.

A fundamental advantage of this invention is its ability to identify starkly, through the appearance of the on-screen virtual patient, exactly what the effect on such a patient would have been in real life had the user acted in the way he did during the virtual case study provided by the system. For example, if a case offered the user the option of prescribing various drugs to treat the virtual patient's condition, and the user chose the wrong drug, the system could, almost in real time, display the (possibly fatal) effects of the incorrect prescription. For instance, a set of descriptors (possibly code fragments, mini-applications, or other graphics tools) could be applied to the virtual patient to cause the displayed figure to faint, vomit, turn different shades of colour, become blotchy, sweat, become feverish, collapse, and possibly, ultimately, die. Of course, many other conditions can be defined and described in suitable code, tools, applications or another format compatible with the system. Moreover, by providing descriptors that replicate particular emotions, a more realistic and effective simulation of real life can be obtained. For example, descriptors may be configured to simulate embarrassment, pain, relief, happiness, sadness, anger etc. in response to particular questions or classes of questions raised by, or particular actions taken by, the user when interacting with the system.

The virtual patient preferably takes the form (to the user) of an animated avatar, preferably rendered so as to have a three-dimensional appearance (albeit, with current technology, on a two-dimensional display).

It is worth mentioning at this point that the effects of the system on test candidates are so startling that most remember the experience very clearly.
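Tying the appearance descriptors to the published evidence base, as in the 63%-probability-of-harm example earlier, could be sketched as a weighted outcome table. The drug names, probabilities and descriptor names below are purely illustrative assumptions; the patent does not specify how evidence probabilities are encoded.

```python
# Sketch of selecting an appearance descriptor for the avatar weighted
# by an evidence base, rather than by a single designer's opinion.
# All probabilities and names are illustrative assumptions.
import random

# Evidence table: decision -> list of (appearance descriptor, probability).
OUTCOME_EVIDENCE = {
    "prescribe_drug_x": [("collapse", 0.63), ("no_change", 0.37)],
    "prescribe_drug_y": [("relief", 0.90), ("rash", 0.10)],
}

def sample_outcome(decision, rng=random.random):
    """Pick the descriptor to apply to the virtual patient, weighted by
    the published evidence for the chosen decision."""
    roll, cumulative = rng(), 0.0
    for descriptor, p in OUTCOME_EVIDENCE[decision]:
        cumulative += p
        if roll < cumulative:
            return descriptor
    return OUTCOME_EVIDENCE[decision][-1][0]  # guard against rounding
```

Passing a deterministic `rng` makes the sampling reproducible for testing, while the default draws from the evidence-weighted distribution at run time.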
When compared to studying comparatively dry and dull textbooks, the system provides a marked improvement. One of the reasons behind this improvement is that the patient condition descriptors (i.e. "the faint", "the vomit", "the collapse", "the death") are independent of the particular virtual patient representation. Accordingly, any virtual reality figure can be incorporated into the system, and it is to this basic figure that the various conditions can be applied. Not only does this make the system very flexible (for instance, it is thus very simple to change the virtual representation from a man to a woman), but it also provides the system as a whole with advanced realism. For example, it could easily be possible to virtually represent someone the user knew in real life, which would further enhance the experience of using the system.

Embodiments of the present invention also allow for the provision of feedback from the virtual patient based on the routes taken through the decision tree. This allows users to receive advice on how their decision path differed from published evidence or peers (for example) via a range of feedback tools. In some embodiments, this feature is implemented in various ways, ranging from a text transcript of the decision path to the virtual patient 'speaking' to the user at the conclusion of the virtual patient consultation and providing a critique of the user's performance. In particular, feedback may include a presentation of data, by the virtual patient, that was not elicited from the system by the user during interaction with the virtual patient.
For example, if the user makes an incorrect diagnosis, or advises an incorrect treatment, thereby taking a path along the decision tree that does not result in good treatment for the condition exhibited by the virtual patient, the virtual patient may (after the consultation) present an explanation as to what the correct path through the decision tree should have been, and why.

A student or other user may interact with the system in a variety of ways, depending on the platform chosen for development of a virtual patient case. Web-based cases can use multiple-choice questions or textual analysis of free text inputted by the student or user. In some embodiments of the present invention, commercial speech recognition software may be employed to allow voice interaction with the virtual patient. By providing speech or voice recognition and processing capability (which in itself is known, and which will therefore not be described in full detail), it is possible for a user to direct spoken questions to the system in such a way as to simulate a real-life interaction, with the avatar representation of the virtual human responding in various ways, for example by talking and moving, to questions or instructions spoken by the user. With currently available speech recognition systems, some degree of training is required so that the system can process and understand a given user's speech, but it is expected that this will become less important in the relatively near future as speech recognition technology improves. The system may accordingly include a microphone or the like and a speech recognition processor for the input of voice commands and questions.
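The text-transcript form of feedback described above, comparing the user's route through the decision tree against a reference path, can be sketched simply. The reference path and wording are illustrative assumptions; the patent leaves the feedback format open.

```python
# Sketch of post-consultation feedback: compare the user's decision path
# with a reference path drawn from published evidence and build a text
# transcript. Path contents and phrasing are illustrative assumptions.

def feedback_transcript(user_path, evidence_path):
    """Return a step-by-step critique of the user's decisions."""
    lines = []
    for step, (taken, best) in enumerate(zip(user_path, evidence_path), 1):
        if taken == best:
            lines.append(f"Step {step}: '{taken}' matched the evidence base.")
        else:
            lines.append(f"Step {step}: you chose '{taken}'; the evidence "
                         f"supported '{best}'.")
    return "\n".join(lines)
```

In a fuller implementation, the same comparison could instead drive the virtual patient 'speaking' the critique, or flag data the user failed to elicit during the consultation.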
Although the general disclosure of preferred embodiments of the present invention is directed primarily to healthcare applications, other embodiments could equally be applied to any situation in education where there is an evidence base or documented experience regarding interaction with humans. Examples could include a simulator to improve communication skills with virtual customers, a simulator for help-desk staff to advise virtual customers of a specific course of action (e.g. help-desk staff training), or even a simulator where the student takes on various other roles, such as a pharmacist speaking to a virtual doctor, a tutor speaking with a virtual student or an employee speaking to a virtual manager during an appraisal.

Throughout the description and claims of this specification, the words "comprise" and "contain", and variations of the words, for example "comprising" and "comprises", mean "including but not limited to", and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.
Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.

Features, integers, characteristics, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith.

BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the present invention and to show how it may be carried into effect, reference shall now be made by way of example to the accompanying drawings, in which:

FIGURE 1 provides a diagrammatic representation of the system as a whole, and

FIGURE 2 shows a possible decision tree structure suitable for a case involving a patient who is an asthma sufferer.

DETAILED DESCRIPTION

As a first part of this description, the method by which cases are designed for the system is described. The first step in designing a case is Patient Selection.

In designing a new case for use with the system according to the invention, the case needs to focus on a single patient. Different patients can be used for individual cases, and information on other people (e.g. family members) can be provided in the branch element/terminus descriptors if relevant.

During this phase of development, it is necessary to describe the patient's profile. The system requires information about the patient such as their description (gender, age, height, weight etc.), previous medical history and any social history.
It is perceived by the applicant herefor that, after a number of cases have been designed, the system may be extended to develop a 'patient population': a small set of patients that can be perceived as members of a virtual community. Such a resource would allow case designers to select a patient from the patient population, or to examine the effect of their decisions across the entire virtual patient population.

A particular case is created by designing a number of scenarios that are linked by the decisions that can be taken. This relates to the decision tree aspect of the invention, which may most usefully be mapped out in a flowchart or organisational chart, such as that shown in Figure 2. Each of the boxes can be thought of as branch elements (i.e. elements from which a branch extends or is possible) or termini (i.e. elements from which no further branch is possible). In each box, there is provided some text indicative of the patient's physical state at that stage in the diagnosis procedure. Also provided in each branch element are a series of options or other means by which a user can enter information or make a selection. This user input is then analysed by the system to allow it to determine, according to the decision tree, which branch element to display next. As can be seen in the Figure, a case will often have more than one final scenario, depending on the various options which are offered to and chosen by the user.

A case is made up of many scenarios which need to be described individually to support the decisions that can be taken. To begin writing a case, it is necessary to consider the following pieces of information for each scenario:

Audience

The type of student for whom the case is designed (e.g. Pharmacy, Medical or Nursing students). As a case has many scenarios, the audience does not always have to be the same for each scenario.
For example, by changing the audience, it is possible to design a case to allow a group of students from various health disciplines to work together on a single case.

Description

Each scenario must be fully described as it will appear to the student (e.g. Anne walks into the pharmacy complaining of...).

Additional Information

The system can easily be adapted to provide additional information in the form of attachments to the user, so word documents, http links, specific pictures and the like can be included, and the system can refer the student to these to support their decision for each scenario.

Decisions

Unless an outcome scenario is being described for a case, it is necessary to provide two or more decisions for each scenario described, together with branch information, i.e. where each decision should lead. A simple numbering scheme for the scenarios would allow one scenario to reference another. In this manner, it is of course possible to reuse scenarios, so that a number of decisions can result in the display of the same scenario.

In the system, it may also be necessary to categorise each decision into three types. If the system is used for a healthcare application, for example, all decisions may typically be broadly categorised as (a) treating the patient, (b) referring the patient to another healthcare professional or (c) giving advice to the patient.

Multimedia

With each scenario, it is possible (although not mandatory) to request visualisation of one or more key points of the scenario through virtual patient technologies incorporated into the system. This feature may invoke an animation based on the user's interaction with the system, and can therefore request a response from the virtual patient based on what has been said.

An example of a case description, which can be parsed by a Decision Engine so as to convert it into a machine-readable format, is provided below. 
Patient Description

Retired Teacher, Caucasian, Weight = 88kg
Married, husband a heavy smoker
Two Children (Luke and Jessica), now 31 and 29

Anne has suffered with asthma since childhood, suffering 3 exacerbations in the past 12 months. She had a total hysterectomy at age 48 (menorrhagia & prolapse); FH of CVD. Her mother died following a stroke at age 76; prior to this she had a succession of TIAs and had moved in with Anne and her husband. Husband worked in management for the local coal board and was retired on grounds of ill health (arthritis) in 1996 (age 62).

Anne buys analgesics regularly from the local pharmacy for her husband (Co-Codamol 8/500) as he doesn't like to bother the GP for such 'minor' medications.

Anne doesn't have any help at home; she does her own cooking and cleaning, and when her asthma is ok her mobility and exercise tolerance is good. She was advised to increase her activity levels a few years ago and has started to walk the dog more since her husband is becoming increasingly unable to walk long distances without significant pain.

Scenario Number: 1
Audience: Pharmacy Students & Medical Students Only

Description
Anne has been asthmatic for many years. She has attended her review annually at the surgery and her prescription has stayed pretty constant for some time. There have been a number of acute exacerbations of her asthma in the past 12 months and she attends for review at the surgery following an Accident and Emergency admission 5 days previously...

Additional Information
[include word documents, links etc. 
to additional resources for student consideration]

Decisions

Category: (a) treating the patient
increase Beclometasone to 250mcg 2 puffs bd MDI (goes to scenario 3)

Category: (c) giving advice to the patient
check inhaler technique (goes to scenario 4)

Category: (a) treating the patient
switch to Easi-breathe devices (goes to outcome scenario 2)

Multimedia
Invokes an animation of the patient attending her annual review and taking a peak flow measurement.

Feedback
[This section would be used to record the feedback that the user should be given, by virtue of navigating to this node in the decision tree. For example, if the node detailed the prescription of an incorrect medication, the feedback might contain information on recommended alternative drugs as well as advice that this action was not advised. Such feedback would be recorded and reported back to the user at the end of the consultation.]

[the description of the remaining 9 scenarios in this case is omitted here in the interest of brevity, but the format is generally similar to the above]

-----------------------------------------------------------------------------------------------------------

In order to present the pre-designed cases in an informative, useful and striking manner, the system is designed as follows.

Referring to Figure 1, the system 2 consists of three main components to deliver the core functionality. These are referred to as:

(1) The Patient Case File 4 - this is an XML-based file that drives the content in each case. The file format can be generated with supporting applications to allow case designers with little or no technical knowledge to create new cases for use with the system.

This file uses an XML definition to allow the decision engine to parse the file and process its contents. The files employ a decision tree to traverse the various scenarios a patient case may have, depending on the decisions taken within a case. 
(2) The Decision Engine 6 - this is responsible for parsing the Patient Case File and rendering the content into a machine-readable format. The decision engine 6 is also responsible for calling any external resources 8 that may be needed to render the case (e.g. documents, images, animations/sound files), and then formats the case back to the user via a standard output format (e.g. web page).
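As an illustration of the parsing and traversal responsibility just described, the following Python sketch loads a miniature case file and follows the decision tree one step. It is an editor's sketch only: the tag names, attributes and structure here are assumptions in the spirit of the Patient Case File, not the exact format prescribed by the specification.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical case file in the spirit of the Patient Case File
CASE_XML = """
<case>
  <patient id="01">Anne Phillips</patient>
  <scenario id="01">
    <text>Anne complains of breathlessness... what do you do?</text>
    <option type="a" path="03">increase Beclometasone</option>
    <option type="c" path="04">check inhaler technique</option>
  </scenario>
  <scenario id="03"><text>Outcome: review in two weeks.</text></scenario>
  <scenario id="04"><text>Outcome: technique corrected.</text></scenario>
</case>
"""

def load_case(xml_text):
    """Parse the case file into a dict of scenario elements keyed by id."""
    root = ET.fromstring(xml_text)
    return {s.get("id"): s for s in root.findall("scenario")}

def next_scenario(scenarios, current_id, choice_index):
    """Follow the path attribute of the chosen option to the next scenario id."""
    options = scenarios[current_id].findall("option")
    return options[choice_index].get("path")

scenarios = load_case(CASE_XML)
dest = next_scenario(scenarios, "01", 1)   # user picks 'check inhaler technique'
print(dest)                                # -> 04
print(scenarios[dest].find("text").text)   # -> Outcome: technique corrected.
```

A full engine would, in addition, resolve external resources referenced by each scenario and record each chosen option type to the database, as described below.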
In accordance with embodiments of the invention, the external resources 8 also include the descriptors which can be applied to the virtual patient, the computer-readable representation of which is similarly retained in the database.

The engine also tracks the decisions taken by a user in each case and then passes this data onto a database 10 for recording. This information is then used when a user wishes to examine a transcript of what decisions the user made for a specific case.

(3) The Database 10 - the database is responsible for tracking decisions taken within each case (and ultimately for delivering feedback to the user, where the feedback functionality is provided) and for keeping a record of the location of external resources that may be required to render a case (e.g. animation files).

The database is also referred to when a user wishes to recall their decisions within a case. This information is also used at a higher level, so that case designers can examine what type of decisions are being made in their case and whether additional supporting information needs to be supplied to the user to improve the decision-making process.

At a technical level, to allow the decision engine to parse the XML file so that the system can provide this functionality, information is declared in the XML file as a series of special XML tags.

At the start of the file, a tag is declared identifying the patient the case applies to:

<patient id="01">Anne Phillips...

Each scenario is then declared via a series of scenario tags that describe what is happening to the patient at this stage of the case. Typically, one would expect to see a series of scenario tags to make up the various scenarios of each case.

<scenario01>Anne complains of breathlessness.....what do you do? </scenario01>

Within each scenario, additional information can be provided to the user (via hyperlinks) before they make their decision. 
This is declared in the file as follows:

<scenario01link url="CMP.doc"> CMP </scenario01link>

<scenario01link url="http://www.sign.ac.uk"> SIGN/BTS Guidelines </scenario01link>

<scenario01link url="http://www.nice.org.uk"> NICE </scenario01link>

Decisions are then declared, being those decisions applicable to the particular scenario. Each of these decisions is categorised via the "type" attribute and is recorded back to the database accordingly.

<scenario01option type="a">increase Beclometasone to 250mcg... </scenario01option>

<scenario01option type="c">check inhaler technique </scenario01option>

<scenario01option type="a">switch to easi-breathe devices </scenario01option>

As a next part of the file, the decisions are mapped to paths within the decision tree to allow the case to traverse the tree correctly. Each scenario is made anonymous by an identification (ID) value and referenced in the XML file thus:

<scenario01path>02</scenario01path>

<scenario01path>03</scenario01path>

<scenario01path>04</scenario01path>

A tag is also included in the XML file which calls an external multimedia resource, and in particular an emotional or physical descriptor file which can be applied to a default virtual human (e.g. avatar) in memory, in accordance with embodiments of the invention. This may be an image file, sound file or an animation to cause the avatar to respond in a predefined way. This may involve using a file from an external media supplier and can be declared in the XML file as follows:

<scenario01resource file="patient01/emotions/pain.flv">02 </scenario01resource>

Such animation files need to be designed before the XML file can reference them. However, once animations are designed, they can be invoked at a code level and applied to different patients. 
Therefore it is possible for the invention to call on a database of animations (using a combination of external and in-house developed multimedia resources) to invoke an emotion in the patient across a number of cases. Thus, common actions (e.g. smiling, angry, sad) could be designed for all patients in one process, allowing for an extensive population of animations which the XML file can reference via this tag.

It is important to note at this point that supporting software applications can be used which guide a designer through writing a case. This software will automatically generate the XML required in a Patient Case File without the user being exposed to the raw XML file format. This allows a case designer to create his/her own case without requiring knowledge of specialist programming languages.

In summary, therefore, a virtual human interaction system is described for use on a PC or web-enabled computer, which facilitates the training and education of users. Its initial application is directed to healthcare practitioners such as doctors, nurses, pharmacists and the like by allowing them to interact virtually with a virtual patient delivered by the system and displayed on a computer screen, although other applications outside the healthcare field may be envisaged. The system embodies a plurality of cases, and for each case there are a number of possible outcomes, depending on the choices made by the healthcare services practitioner at each stage in a particular case. Such choices are made in the form of user input to the system through the computer interface, and each case consists of a decision tree element consisting of branch elements from which the decision tree may branch in one or more directions toward further branch elements or tree termini, the user input causing the system to move through the decision tree of the case. 
Each branch element and terminus includes descriptors of a particular condition of the virtual human at that time in the specific case, and these are displayed to the user at each specific stage in the case to provide the user with a current indication of the well-being of the virtual patient. Together with the virtual patient displayed by the system, also incorporated into the system are a plurality of appearance descriptors which can be applied to the virtual patient by the system so as to cause a change in the appearance thereof, said descriptors being based on real-life human conditions which affect the physical appearance of humans generally and which are thus mimicked in the virtual patient. In accordance with the invention, the system causes the appearance of the virtual patient to change by applying one or more of said appearance descriptors to said virtual patient as the system moves through the decision tree in response to user input. The resulting effect is to provide users with an almost real-time indication of the effect of their actions on patients.
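The interplay summarised above between decision-tree navigation and appearance descriptors can be sketched as follows. This is an editor's illustration under assumed names — the specification does not prescribe any particular implementation, and the node-to-descriptor mapping here is hypothetical:

```python
# Hypothetical mapping from decision-tree node ids to appearance descriptor files,
# in the spirit of the resource tag described earlier in this specification.
DESCRIPTORS = {
    "02": "emotions/pain.flv",     # patient appears in pain
    "03": "emotions/relief.flv",   # patient appears relieved
}

class VirtualPatient:
    """Default avatar whose appearance changes as descriptors are applied."""
    def __init__(self):
        self.appearance = "neutral"

    def apply_descriptor(self, descriptor):
        self.appearance = descriptor

def navigate(patient, node_id):
    """Move to a decision-tree node, applying any descriptor associated with it."""
    if node_id in DESCRIPTORS:
        patient.apply_descriptor(DESCRIPTORS[node_id])
    return patient.appearance

anne = VirtualPatient()
print(navigate(anne, "02"))  # -> emotions/pain.flv
```

On each user decision, the engine would call something like `navigate` with the destination node, so the avatar's appearance tracks the state of the case in near real time.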

Claims (4)

1. A virtual human interaction system having a user terminal, said system being adapted to be capable of providing a plurality of virtual situations to a user, each virtual situation being capable of providing a variety of outcomes to the user, user input to the system determining which outcome is provided; said system comprising at least:

a respective case file for each virtual situation, each case file consisting of at least a decision tree element consisting of branch elements from which the decision tree may branch in one or more directions toward further branch elements or tree termini, each branch element and terminus including descriptors of a particular condition of the virtual human in the specific virtual situation, the user navigating, in use, the decision tree element during interaction with the system;

a virtual human representation element capable of being displayed graphically to appear as a virtual human on screen;

a plurality of appearance descriptor elements, respectively associated with the branch elements or tree termini, for interacting with the virtual human representation element so that the appearance of the virtual human representation element reflects where in the decision tree element the user has navigated to, said appearance descriptor elements being based on real-life human conditions which affect the physical appearance of humans generally and which are thus mimicked in the virtual human representation element; and

a decision engine which is arranged to cause the appearance of the virtual human to change by applying one or more of said appearance descriptor elements to said virtual human representation element when, as a result of user input at said terminal, the user navigates from one of the branch elements or termini of the decision tree having a descriptor element associated with it to another branch element or terminus of the decision tree having a respectively different descriptor element associated with it; 
wherein the decision engine is arranged:

to parse the case file and to render the content into a machine-readable format,

to call external resources as necessary depending on to where in the decision tree element the user has navigated,

to render each case file, and any external resources which have been called, and format such for provision to the user,

to track the decisions taken by a user when navigating the decision tree element, and

to pass data on the tracked decisions to a database for recordal; and wherein

the database is arranged to track the decisions which are taken, in use, within each case, and to keep a record of the location of the external resources which are available to be called by the decision engine.
2. A system according to any preceding claim wherein the virtual human element is provided in the form of a virtual patient displayed on-screen at the user terminal.
3. A system according to claim 2 wherein the virtual patient is arranged to suffer from a predetermined ailment or condition, the branch element or termini descriptors, which may be complete or incomplete as far as the condition of the virtual patient is concerned, are arranged to be provided successively by the decision engine to the user as information which should be indicative of, or at least suggestive of, the particular ailment or condition, and the decision engine is arranged to provide one or more options to the medical services practitioner from which a selection of one or more options is made, said options being returnable to the decision engine to cause the decision engine to move to the next branch element or terminus in the decision tree, to display the next descriptor associated therewith, and to cause the appearance of the virtual patient to change if so demanded by the case and the previous user input.
4. A virtual human interaction system substantially as described herein with reference to Figs. 1 and 2.
AU2007361697A 2007-11-27 2007-11-27 Virtual human interaction system Ceased AU2007361697B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2013206341A AU2013206341A1 (en) 2007-11-27 2013-06-14 Virtual human interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/GB2007/050719 WO2009068838A1 (en) 2007-11-27 2007-11-27 Virtual human interaction system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2013206341A Division AU2013206341A1 (en) 2007-11-27 2013-06-14 Virtual human interaction system

Publications (2)

Publication Number Publication Date
AU2007361697A1 AU2007361697A1 (en) 2009-06-04
AU2007361697B2 true AU2007361697B2 (en) 2013-03-21

Family

ID=39267904

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2007361697A Ceased AU2007361697B2 (en) 2007-11-27 2007-11-27 Virtual human interaction system

Country Status (2)

Country Link
AU (1) AU2007361697B2 (en)
WO (1) WO2009068838A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115762688A (en) * 2022-06-13 2023-03-07 人民卫生电子音像出版社有限公司 Super-simulation virtual standardized patient construction system and diagnosis method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6692258B1 (en) * 2000-06-26 2004-02-17 Medical Learning Company, Inc. Patient simulator
US20040064298A1 (en) * 2002-09-26 2004-04-01 Robert Levine Medical instruction using a virtual patient
US20040121295A1 (en) * 2002-12-20 2004-06-24 Steven Stuart Method, system, and program for using a virtual environment to provide information on using a product
WO2005055011A2 (en) * 2003-11-29 2005-06-16 American Board Of Family Medicine, Inc. Computer architecture and process of user evaluation
US6972775B1 (en) * 1999-11-01 2005-12-06 Medical Learning Company, Inc. Morphing patient features using an offset


Also Published As

Publication number Publication date
AU2007361697A1 (en) 2009-06-04
WO2009068838A1 (en) 2009-06-04

Similar Documents

Publication Publication Date Title
EP1879142A2 (en) Virtual human interaction system
Fealy et al. The integration of immersive virtual reality in tertiary nursing and midwifery education: A scoping review
Bong et al. Tangible user interface for social interactions for the elderly: a review of literature
Meskó The impact of multimodal large language models on health care’s future
Damar What the literature on medicine, nursing, public health, midwifery, and dentistry reveals: An overview of the rapidly approaching metaverse
Kenny et al. Virtual humans for assisted health care
Meuschke et al. Narrative medical visualization to communicate disease data
Zhou et al. Virtual reality as a reflection technique for public speaking training
Hah et al. How clinicians perceive artificial intelligence–assisted technologies in diagnostic decision making: Mixed methods approach
Moore et al. Exploring user needs in the development of a virtual reality–based advanced life support training platform: exploratory usability study
Bhowmick et al. Pragati: design and evaluation of a mobile phone-based head mounted virtual reality interface to train community health workers in rural India
Roma et al. Medical device usability: literature review, current status, and challenges
Hernandez Health literacy, eHealth, and communication: putting the consumer first: workshop summary
Gonzalez-Moreno et al. Improving humanization through metaverse-related technologies: a systematic review
Gross et al. Setting an agenda: results of a consensus process on research directions in distance simulation
Pillay et al. The power struggle: Exploring the reality of clinical reasoning
Zechner et al. NextGen Training for Medical First Responders: Advancing Mass-Casualty Incident Preparedness through Mixed Reality Technology
AU2007361697B2 (en) Virtual human interaction system
Brammer et al. Developing Innovative Virtual Reality Simulations to Increase Health Care Providers' Understanding of Social Determinants of Health
AU2013206341A1 (en) Virtual human interaction system
Birns et al. Development of a novel multimedia e-learning tool for teaching the symptoms and signs of stroke
Joekes 82 Breaking Bad News
Kantz et al. Language functions and medical communication: the human body as text
Fickenscher et al. Education in Virtual Care Delivery: Clinician Education and Digital Health Literacy
Roughley et al. Cystic Fibrosis: A Pocket Guide

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired