US20140127662A1 - Computerized medical training system - Google Patents

Computerized medical training system

Info

Publication number
US20140127662A1
US20140127662A1
Authority
US
United States
Prior art keywords
user
medical
patient
avatar
physician
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/924,205
Inventor
Frederick W. Kron
Noah Falstein
Stacy Marsella
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medical Cyberworlds Inc
Original Assignee
Medical Cyberworlds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US83018306P
Priority to US11/776,978 (patent US8469713B2)
Application filed by Medical Cyberworlds Inc
Priority to US13/924,205 (publication US20140127662A1)
Assigned to MEDICAL CYBERWORLDS, INC. Assignment of assignors' interest (see document for details). Assignors: FALSTEIN, NOAH; MARSELLA, STACY; KRON, FREDERICK W., DR.
Publication of US20140127662A1
Application status: Abandoned


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 Electrically-operated educational appliances
    • G09B 5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B 5/065 Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
    • G09B 5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • G09B 23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28 Models for scientific, medical, or mathematical purposes for medicine

Abstract

A method of medical training includes presenting a user with a medical scenario within a medical simulation in which the user plays a physician. The medical scenario includes an interaction between the user and a patient. Performance data corresponding to the user is identified. The identified performance data is based at least in part on an action of the user during the interaction between the user and the patient. The user is evaluated based at least in part on the identified performance data to determine whether the user has achieved a training goal within the medical simulation. The training goal is intended to improve a medical skill of the user.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 11/776,978, filed 12 Jul. 2007, now pending, which claims the benefit of U.S. provisional application No. 60/830,183, filed 12 Jul. 2006. The foregoing are hereby incorporated by reference as though fully set forth herein.
  • FIELD
  • The subject of the disclosure relates generally to medical training. More specifically, the disclosure relates to a method, system, and computer-readable medium for teaching empathy, patient-centeredness, professionalism, interviewing micro-skills, and communication skills (e.g., bedside manner) through constructivist learning in a social, three-dimensional environment with emotionally expressive avatars.
  • BACKGROUND
  • During medical school, medical students learn human anatomy, disease symptoms, disease stages, diagnostic techniques, treatment techniques, and other scientific information needed to think critically about the diagnosis and treatment of illnesses. However, being a good physician requires more than just scientific knowledge. A physician also needs to be able to form and maintain effective relationships with his/her patients. This can be challenging because each patient is a unique individual who can differ from other patients in appearance, background, experience, educational level, cognitive ability, religion, attitude, ethnicity, etc. To form effective relationships with such a wide array of patients, the physician must be patient, perceptive, understanding, supportive, and empathic. In addition, the physician needs to be able to present himself/herself as a knowledgeable and trustworthy individual.
  • Present-day medical curricula focus on conveying scientific knowledge and do not adequately train physicians in patient interactions. This lack of curricular emphasis on medical humanism and the lack of proper integration of sociological and psychological information into medical training result in physicians who do not have the necessary tools to understand patients as people, to effectively convey information to patients, or to effectively listen to patients. Even in medical schools in which an effort to teach medical humanism is made, the effort is largely ineffective. Books and lectures do not even begin to expose medical students to the wide array of situations in which physicians find themselves during their day-to-day practice. Further, books and lectures do not provide medical students with any experience in interacting with patients, colleagues, medical staff, pharmacists, superiors, insurance company representatives, or other individuals involved in the medical profession. Books and lectures are further limited in their ability to help medical students understand why certain information is critical to acquire, and what the consequences are if that information is ignored.
  • As a result of the above-mentioned curricular deficiency, the majority of physicians are unable to effectively communicate with or otherwise relate to their patients. Poor physician/patient relationships can lead to misdiagnosis and/or other medical errors. Medical errors result in approximately 200,000 patient deaths each year, more than the number of individuals who die from motor vehicle accidents, breast cancer, and AIDS combined. Medical errors are also the primary basis for medical malpractice claims brought against physicians. In addition, poor physician/patient communication can lead to uninformed patients, misinformed patients, unhappy patients, patients who are unable or unwilling to adhere to a prescribed treatment, and/or patients who reject the medical profession.
  • Another problem in the medical profession is that most physicians enter practice with no concept of medical economics, medical policies, good business practices, or good management practices. This again stems from an inability to effectively teach these skills in a traditional medical school classroom. Yet another problem in the medical profession is the lack of consistency in medical practices across professional communities. Medical practices are inconsistent from region to region, from state to state, from medical school to medical school, and even among different faculty at the same medical school. Inconsistent medical practices can make it difficult for physicians to transfer locations and/or work with physicians in other geographic regions. Inconsistencies can also make a difficult and sometimes frustrating profession even more frustrating for new medical school graduates. In addition, it can be especially difficult for medical students because inconsistencies exist not only between regions and individuals, but also between the attitudes and practices that medical students observe and the values that are explicitly taught to them. This so-called ‘hidden curriculum’ (i.e., the social and cultural aspects of education that exist alongside an educational institution's stated or intended curricular objectives) creates a huge problem for medical students as they try to develop an ethical and reflective style of practice.
  • Thus, there is a need for a medical training system capable of teaching medical personnel how to effectively interact with patients through actual experience. There is also a need for a medical training system capable of teaching medical personnel how to effectively interact with other physicians, assistants, staff, billing coordinators, and any other personnel associated with the medical profession. There is also a need for a medical training system capable of effectively teaching medical personnel about medical economics, medical policy, good business practices, and good management practices. Further, there is a need for a medical training system capable of consistently training a large number of medical personnel such that medical practices are able to become more consistent throughout the medical profession and conform more closely to the humanistic, patient-centered values that are espoused.
  • SUMMARY
  • An exemplary method of medical training includes presenting a user with a medical scenario within a medical simulation in which the user plays a physician. The medical scenario includes an interaction between the user and a patient. Performance data corresponding to the user is identified. The identified performance data is based at least in part on an action of the user during the interaction between the user and the patient. The user is evaluated based at least in part on the identified performance data to determine whether the user has achieved a training goal within the medical simulation. The training goal is intended to improve a medical skill of the user.
  • An exemplary computer-readable medium has computer-readable instructions stored thereon that, upon execution by a processor, cause the processor to maintain a medical simulation. The instructions are configured to present a user with a medical scenario in which the user plays a physician. The medical scenario comprises an interaction between the user and a patient. Performance data corresponding to the user is identified. The identified performance data is based at least in part on an action of the user during the interaction between the user and the patient. The instructions are further configured to determine, based at least in part on the identified performance data, whether the user has achieved a training goal within the medical simulation. The training goal is intended to improve a medical skill of the user.
  • An exemplary system for medical training includes a medical training engine, a memory, and a processor. The medical training engine includes computer code configured to generate a medical scenario within a medical simulation. The medical scenario is presented to a user, where the user plays a physician within the medical scenario. Performance data based on an action of the user within the medical scenario is identified. The computer code is further configured to determine, based at least in part on the identified performance data, whether the user has achieved a training goal within the medical simulation. The training goal is intended to improve a medical skill of the user. A status of the user is increased within the medical simulation if the user has achieved the training goal. The memory is configured to store the medical training engine. The processor is coupled to the memory and configured to execute the medical training engine.
  • Other principal features and advantages will become apparent to those skilled in the art upon review of the following drawings, the detailed description, and the appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Exemplary embodiments will hereafter be described with reference to the accompanying drawings.
  • FIG. 1 is a flow diagram illustrating operations performed by a medical training system in accordance with an exemplary embodiment.
  • FIG. 2 is a medical training system in accordance with an exemplary embodiment.
  • FIG. 3 is a diagram illustrating components of a medical training system in accordance with an exemplary embodiment.
  • FIG. 4 is a diagram illustrating a physician office in accordance with an exemplary embodiment.
  • FIG. 5 is a diagram illustrating a reception area in accordance with an exemplary embodiment.
  • FIG. 6 is a diagram illustrating a corridor within the medical facility in accordance with an exemplary embodiment.
  • FIG. 7 is a diagram illustrating an exam room in accordance with an exemplary embodiment.
  • FIG. 8 is a diagram illustrating a virtual conference room in accordance with an exemplary embodiment.
  • DETAILED DESCRIPTION
  • Described herein is a training system adapted to teach users how to successfully interact with other individuals. In an exemplary embodiment, the training system (or system) can be an on-line simulation in which users can interact with one another and participate in scenarios. The system can also refer to any of the hardware and/or software which is used to implement the simulation. The system can be implemented by any programming method known to those of skill in the art. In an exemplary embodiment, the on-line simulation can utilize massively multi-player on-line gaming (MMOG) structures, procedures, and methodologies. As such, the simulation can take place in a distributed, three-dimensional synthetic environment. The synthetic environment, which may include game play design elements, can also include avatars. As used herein, an avatar can refer to a simulated character controlled by a live user or a computer controlled (i.e., virtual) character within the simulation. Avatars controlled by the computer can feature cognitive and emotional modeling such that their behavior is highly realistic and human-like. These features can combine to create a naturalistic, highly social, and constructivist learning environment in which users can experientially learn values, empathy, patient and/or client-centeredness, professionalism, interviewing micro-skills, communication skills, etc. The simulation can also utilize virtual reality technology to further enhance the realism and immersiveness of the simulation.
  • As described herein, the system can be used to teach medical students how to interact and form relationships with patients and/or any other individuals whom they may encounter in their role as physicians. The system can also be used by practicing physicians to obtain continuing medical education credits, by medical school professors to interact with and test their students, by medical staff for training purposes, by patients who want to form better relationships with their physicians, and/or by the general public as a diversion. While the training system is primarily described with reference to training in the medical field, it is important to understand that the system is not so limited. Using the same principles described herein, the training system can be applied to any field/profession in which individuals are required to interact and form relationships with other individuals. For example, the training system can be used in the legal field to teach law students and attorneys how to effectively interact with their clients. Similarly, the training system can be used to teach nurses, dentists, business managers, coaches, teachers, receptionists, customer service specialists, bankers, etc. how to interact with and treat their clients, students, and/or employees.
  • FIG. 1 is a flow diagram illustrating operations performed by a medical training system in accordance with an exemplary embodiment. Additional, fewer, or different operations may be performed in alternative embodiments. In an operation 100, the medical training system can receive registration information from a user. The registration information can include the user's name, contact information, profession, level of training, etc. The registration information can be received by any method known to those of skill in the art. In one embodiment, the system can also receive a monetary fee from the user during registration. The monetary fee can be paid directly by the user, by the medical school which the user attends, by the user's employer, or by any other entity. In an exemplary embodiment, the user can be a medical student, and the registration process can be performed when the medical student registers for a class which utilizes the medical training system as an instructional tool.
  • In an operation 105, the system provides the user with access to a medical simulation. The medical simulation can be a virtual world including one or more medical settings in which physicians can interact with patients and other individuals. The medical settings can be any locations, such as a clinic, a hospital, a patient's home, a nursing home, a hospice facility, etc., in which a medical practitioner may be called upon to practice medicine. The medical simulation can also include homes, stores, vehicles, banks, governments, etc. in both urban and rustic settings to make the medical simulation more realistic. Access to the medical simulation can be provided through medical simulation software which the user downloads and installs on his/her computer. Alternatively, access can be provided through a website upon which the medical simulation can be accessed. Alternatively, access to the medical simulation can be provided by any other method known to those of skill in the art. In one embodiment, different versions of the medical simulation can be used to accommodate different types of users. A first version can be for medical student users, a second version can be for practicing physician users, and a third version can be for the general public. Alternatively, a single version can be used to accommodate all system users.
  • In an exemplary embodiment, the medical simulation can be an on-line world which is accessible through a network such as the Internet. As such, multiple users can interact with and learn from one another. FIG. 2 is a medical training system 200 in accordance with an exemplary embodiment. Medical training system 200 includes a medical training server 205 connected to a network 210. Medical training server 205 can be a computer including a memory, a processor, and input/output ports. As such, medical training server 205 can be capable of maintaining one or more medical simulations, receiving data from users within the one or more medical simulations, sending data to users within the one or more medical simulations, and/or storing data corresponding to the users and/or the one or more medical simulations. Network 210 can be any network upon which information can be transmitted as known to those of skill in the art. A first user device 215, a second user device 220, and a third user device 225 can be in communication with medical training server 205 through network 210. First user device 215 can be a desktop computer of a first user, and second user device 220 can be a laptop computer of a second user. Third user device 225 can be a cellular telephone, personal digital assistant, or any other electronic device of a third user which is capable of communicating with medical training server 205 through network 210. In an alternative embodiment, the system may communicate with users through peer-to-peer networking, and medical training server 205 may not be used.
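The client/server layout of FIG. 2 can be sketched in miniature as follows. This is an illustrative sketch only; the class and method names (MedicalTrainingServer, UserDevice, broadcast, etc.) are assumptions, not terms from the patent, and a real implementation would use actual network transport rather than in-memory delivery.

```python
# Hypothetical sketch of FIG. 2: a single training server that tracks
# connected user devices and relays simulation data to each of them.

class UserDevice:
    """A connected client: desktop, laptop, phone, or other device."""
    def __init__(self, device_id, kind):
        self.device_id = device_id
        self.kind = kind
        self.inbox = []  # messages delivered by the server

    def receive(self, message):
        self.inbox.append(message)


class MedicalTrainingServer:
    """Maintains medical simulations and routes data to user devices."""
    def __init__(self):
        self.devices = {}

    def connect(self, device):
        self.devices[device.device_id] = device

    def broadcast(self, message):
        # Send a simulation update to every connected device.
        for device in self.devices.values():
            device.receive(message)


server = MedicalTrainingServer()
desktop = UserDevice("user1", "desktop")
laptop = UserDevice("user2", "laptop")
phone = UserDevice("user3", "cellular")
for d in (desktop, laptop, phone):
    server.connect(d)

server.broadcast({"event": "scenario_start", "scenario": "exam_room"})
```

In the alternative peer-to-peer embodiment mentioned above, the broadcast step would instead be performed by the user devices exchanging messages directly.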
  • Referring back to FIG. 1, in an operation 108, the system receives personal information regarding the user. The personal information can be any information which can be used to evaluate the user's medical experience, medical skills, strengths, weaknesses, and/or computing experience. The personal information can be used to help design an avatar for the user, to determine appropriate training goals for the user within the medical simulation, and/or to determine appropriate medical scenarios in which the user should participate. As an example, the user can indicate that his/her biggest fear is conveying bad news, such as telling the patient that he/she has a chronic illness, that his/her therapy has failed, that he/she is going to die, etc. To help the user overcome this fear, the user can be placed in medical scenarios with patients who need to be told that they suffer from a chronic illness. The personal information can also include psychometric data of the user. The psychometric data may be used to condition interactions of the user within the medical simulation.
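The use of personal information to select medical scenarios, such as placing a user who fears conveying bad news into chronic-illness scenarios, can be sketched as below. The scenario names, field names, and mapping are all illustrative assumptions.

```python
# Hypothetical mapping from reported weaknesses to targeted scenarios.
SCENARIO_LIBRARY = {
    "conveying_bad_news": ["chronic_illness_diagnosis",
                           "failed_therapy_discussion"],
    "impatience": ["confused_patient_interview"],
}


def select_scenarios(personal_info):
    """Return scenarios targeting the weaknesses the user reported."""
    scenarios = []
    for weakness in personal_info.get("weaknesses", []):
        scenarios.extend(SCENARIO_LIBRARY.get(weakness, []))
    return scenarios


chosen = select_scenarios({"name": "student1",
                           "weaknesses": ["conveying_bad_news"]})
```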
  • In an exemplary embodiment, the personal information can be received directly from the user. Alternatively, the personal information can be received from the user's professor, colleagues, patients, etc. In an alternative embodiment, the personal information can be obtained during the registration process described with reference to operation 100. In another alternative embodiment, the personal information can be received as part of a medical simulation tutorial presented to the user. Alternatively, the medical simulation tutorial may be presented to the user only if the user indicates that he/she has a lack of computing experience or lack of experience within virtual worlds.
  • In an operation 110, an avatar is established for the user. The avatar can be a digital representation of the user through which he/she is able to interact in the medical simulation with the avatars of other users and avatars generated and controlled by the system. In one embodiment, the user can select his/her avatar such that the avatar is similar in appearance to the user. For example, if the user is a 25 year old, 5′4″ white female with blond hair, she can select an avatar with those characteristics. Alternatively, the user may be allowed to select an avatar with any other of a normal range of human characteristics. In an alternative embodiment, the avatar may be assigned to the user by the system or a training administrator such as a medical school professor. For example, a female user may be assigned a male avatar such that the female user can experience how various patients respond to a male physician and/or how various physicians respond to a male patient. In another alternative embodiment, the system can assign an avatar to the user based on the received personal information regarding the user.
  • In an exemplary embodiment, the user can have at least one avatar for each role that he/she is to play within the medical simulation. For example, the user can select (or be assigned) a physician avatar for his/her role as a physician, and a patient avatar for his/her role as a patient. In another exemplary embodiment, the patient avatar can be generated by the system to exhibit specific patient traits. In addition, the patient avatar may not be specific to the user such that a plurality of users can use the same patient avatar either simultaneously or at different times within the medical simulation. In an alternative embodiment, the user can have a single avatar for all of his/her roles within the medical simulation.
  • In an exemplary embodiment, the established avatar can have a status and/or characteristics which may change over a period of time. For example, a new user, such as a first year medical student who is to play the role of a physician, can have a beginner status. As such, the first year medical student's avatar can exhibit characteristics typical of a novice physician. For example, the avatar can appear hesitant, unsure, or intimidated when interacting with a patient. These characteristics can be exhibited through facial expressions of the avatar, body language of the avatar, the dialog of the avatar, and/or the general appearance of the avatar. Once the first year medical student has passed an exam, achieved a goal, or otherwise proven himself/herself within the medical simulation, the status and/or characteristics of his/her avatar can be upgraded to reflect a more experienced physician. For example, as a second year medical student, the user can have an intermediate status such that his/her avatar appears and/or acts more calm and professional within the medical simulation. Other aspects of an avatar which may change over time can include the avatar's clothing, the avatar's title (i.e., intern, resident, new doctor, senior doctor, etc.), the avatar's prestige, the avatar's trustworthiness, the avatar's friendliness, etc.
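The status progression described above can be sketched as follows. The status tiers, trait labels, and promotion rule are illustrative assumptions rather than the patent's specification.

```python
# Hypothetical avatar whose status and displayed traits upgrade as the
# user passes exams or achieves goals within the simulation.
STATUS_TRAITS = {
    "beginner": ["hesitant", "unsure"],
    "intermediate": ["calm", "professional"],
    "advanced": ["confident", "authoritative"],
}
STATUS_ORDER = ["beginner", "intermediate", "advanced"]


class Avatar:
    def __init__(self, user):
        self.user = user
        self.status = "beginner"

    @property
    def traits(self):
        # Traits shown via facial expressions, body language, and dialog.
        return STATUS_TRAITS[self.status]

    def promote(self):
        """Upgrade the avatar after a passed exam or achieved goal."""
        i = STATUS_ORDER.index(self.status)
        if i < len(STATUS_ORDER) - 1:
            self.status = STATUS_ORDER[i + 1]


avatar = Avatar("student1")
avatar.promote()  # e.g., the user passed the first-year exam
```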
  • In an alternative embodiment, the status and/or characteristics of the avatar can be based on actual characteristics of the user. For example, the user can be an impatient person who tends to get upset when a patient does not understand an instruction or explanation. The avatar of the impatient user can likewise exhibit the user's impatient behavior through body language and facial expressions. While interacting with patients within the medical simulation, the user can view his/her avatar and see first hand how the impatient behavior adversely affects the patient relationship. In an exemplary embodiment, characteristics to be attributed to a user's avatar can be identified by the user, a medical school professor who teaches the user, the user's colleagues in the medical practice, feedback from actual patients of the user, etc.
  • In an operation 115, a goal is provided to the user. In an exemplary embodiment, the goal can be a basis within the medical simulation by which the user is able to elevate the status and/or characteristics of his/her avatar. For example, the goal may be to have a 90% or higher patient satisfaction rate within the medical simulation over a specified period of time. Alternatively, the goal may be to overcome an adverse characteristic of the user such as hesitancy, impatience, arrogance, condescendence, etc. These characteristics can be operationally defined within the medical simulation such that they are amenable to modification. The goal can also be to improve a patient's attitude, to convince a patient to adopt a healthier lifestyle, to convince a patient to follow a treatment schedule, or otherwise improve a patient's health. The goal can also be to successfully manage a medical facility with a given budget, to learn how to admit patients as a receptionist, to learn how to deal with grieving family members of a patient, to accurately diagnose patients, to handle a false accusation by a patient, to apologize for a mistake, etc.
  • In one embodiment, users can be provided with one or more short term goals, one or more medium term goals, and/or one or more long term goals. For example, in the role of a physician, the user may have a short term goal of convincing a hypertensive patient to comply with his/her medication regimen. The user may also have a medium term goal of obtaining a patient satisfaction rating of at least 85% for a physician/patient interaction which occurs over a plurality of office visits and over an extended period of time. Patient satisfaction can be determined based on dialog used by the physician, feedback from the patient, and/or an evaluation from a professor or other observer of the interaction. The patient can be a live user or a computer-generated virtual patient. Feedback can be obtained directly from the patient user, generated by the system, and/or provided by one or more third parties who view the interaction. The user may have a long term goal of passing a set of standardized exams at the end of a course which features the medical training system as a learning tool.
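The short/medium/long term goal structure, including the 85% satisfaction example, can be sketched as below. Field names and the averaging rule are assumptions made for illustration.

```python
# Hypothetical goal records for a user playing a physician.
goals = [
    {"term": "short",
     "description": "convince hypertensive patient to follow medication regimen",
     "achieved": False},
    {"term": "medium",
     "description": "patient satisfaction of at least 85% across visits",
     "achieved": False},
    {"term": "long",
     "description": "pass end-of-course standardized exams",
     "achieved": False},
]


def check_satisfaction_goal(goal, ratings, threshold=0.85):
    """Mark a satisfaction goal achieved if the mean rating meets the threshold."""
    average = sum(ratings) / len(ratings)
    goal["achieved"] = average >= threshold
    return goal["achieved"]


# Ratings gathered over a plurality of office visits.
met = check_satisfaction_goal(goals[1], [0.9, 0.8, 0.88])
```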
  • In the role of a patient, the user may be a respected businessman who is addicted to drugs. The patient can have a short term goal of convincing a physician to issue a drug prescription without divulging his illegal drug use. The patient can have a medium term goal of getting referred to an out of town specialist (contrary to his insurance rules) for a seemingly simple infection. The patient can have a long term goal of seeking treatment for what he suspects may be AIDS without divulging his drug use or the fact that he has shared needles with others. A patient goal can also include convincing the physician that the patient is healthy such that the patient is relieved from thinking about his/her illness. As such, the ‘victory’ conditions for the physician and for the patient may conflict with one another. These types of patient goals can be used as a tool to inform physicians about the potentially obscure, counter-intuitive, and seemingly counterproductive motivations of patients. The patient goals can also reveal the necessity of taking a broad, psychobiosocial view of the patient such that the user can completely understand and adequately address the patient's true concerns.
  • In an exemplary embodiment, upon completion of a goal, the user's status within the medical simulation can improve. The user's status can be improved by enhancing his/her avatar, by increasing his/her salary within the medical simulation, by giving him/her a promotion, or otherwise advancing/enhancing the professional status of the user. Alternatively, the user's training within the medical simulation can be completed when a goal is achieved. In another exemplary embodiment, the goal(s) for the user can be established by the system, by a medical professor, by colleagues of the user, by the user, by patients, and/or by a medical examination board. In one embodiment, the goal(s) can be established based at least in part on the received personal information regarding the user.
  • In an operation 120, the user is presented with a medical scenario within the medical simulation. In an exemplary embodiment, the medical scenario can be a physician/patient interaction in which the user plays the role of either the physician or the patient. Alternatively, the medical scenario can be an interaction between a physician and relatives of a patient who has undergone surgery, been diagnosed with a terminal illness, passed away, or been successfully treated. The medical scenario can also be an interaction between an emergency medical technician and an accident victim at the scene of an accident, an interaction between an emergency room desk attendant and an individual seeking admittance into the emergency room, an interaction between a medical facility administrator and a bill collector, an interaction between medical personnel and an insurance company, or any other interaction which can occur in the medical profession. Medical scenarios are described in more detail with reference to FIG. 3.
  • In an operation 125, the system identifies performance data based on an action of the user during the medical scenario. The performance data can be any information related to the user's behavior during the medical scenario which can be used to grade, score, view, or otherwise evaluate the user. For example, the performance data can be an audio and/or video capture of the medical scenario which is capable of being replayed from any point of view by an evaluator. The performance data can also be timing information based on occurrences within the medical scenario, accuracy data based on statements made by the user during the medical scenario, accuracy data based on a diagnosis made by the user, accuracy data based on a test run on a patient by the user, feedback from any participants in or viewers of the medical scenario, professor comments, decisions made by the user, metrics regarding the user's style of interaction, etc. The performance data can also include eye movements of the user. For example, a camera can be used to track the user's eye movements during a medical scenario to determine whether the user is making eye contact with a patient, looking down at the ground, rolling his/her eyes, etc. Similarly, in embodiments in which speech is used, the performance data can include voice analysis data. For example, a microphone and a speech analyzer can be used to detect a stress level of the user, a nervousness level of the user, a tone of voice (i.e., friendly, hostile, etc.) of the user, and so on. In one embodiment, a camera may be used to capture the user's eye movements, facial expressions, body language, and other body movements. The captured movements/expressions can be attributed to the user's avatar in real time such that the avatar mimics what the user is doing in the real world. Any or all of the captured movements/expressions can also be used as performance data to evaluate the user.
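A performance-data record combining the signal types named above (timing, accuracy, eye tracking, voice analysis) might look like the following sketch. All field names, sample encodings, and derived metrics are illustrative assumptions.

```python
# Hypothetical aggregation of performance data from one medical scenario.

def eye_contact_fraction(gaze_samples):
    """Fraction of gaze samples in which the user looked at the patient."""
    hits = sum(1 for target in gaze_samples if target == "patient")
    return hits / len(gaze_samples)


performance_data = {
    "scenario": "exam_room_interview",
    "diagnosis_correct": True,            # accuracy of the diagnosis
    "response_delay_sec": [2.1, 1.4, 3.0],  # timing of the user's replies
    "eye_contact": eye_contact_fraction(
        ["patient", "patient", "floor", "patient"]),  # camera-derived
    "voice": {"stress_level": 0.3, "tone": "friendly"},  # speech analyzer
}
```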
  • As an example, a medical scenario can be an interaction between a patient and a physician in an examination room of a medical clinic. The physician can be a first user, and the patient can be a second user or a computer controlled patient, depending on the embodiment. The physician can elicit symptom information from the patient, perform tests on the patient, access medical literature, make a diagnosis based on the symptom information, provide the diagnosis to the patient, and/or recommend one or more treatment options to the patient. The performance data can be based on the physician's sincerity when speaking with the patient, questions asked by the physician, the physician's level of seriousness, the physician's responses to questions asked by the patient, the physician's handling of phone calls or pages during the patient examination, the level of trust which the physician engendered in the patient, the punctuality of the physician, the physician's ability to detect falsehoods from the patient, and/or the responsiveness of the physician. Physician characteristics such as sincerity, trust, and seriousness can be operationally defined such that they can be measured and used as part of the user's evaluation. The performance data can also be based on the accuracy of the physician's diagnosis, the accuracy of tests run by the physician, the accuracy of the treatment recommended by the physician, etc. In an exemplary embodiment, performance data can also be captured based on the patient's actions during medical scenarios in which the patient is also a user. Performance data of a patient can be used to ensure that the patient is acting realistically, is responding appropriately, and is not in collusion with the physician.
  • In an operation 130, the user is evaluated based on the performance data. In an exemplary embodiment, the user can be evaluated (or assessed) by a professor, a superior, a medical board, a testing agency, or any other individual(s) associated with the user. Alternatively, the user can be evaluated by the system. The evaluation, which can be objective or subjective, can be based on any criteria established by the evaluator(s). In one embodiment, the user can be evaluated by comparing his/her handling of a medical scenario to the handling of the medical scenario by one or more medical experts. The user can be evaluated after each medical scenario in which the user participates, after participation in a predetermined number of medical scenarios, at an established time such as the end of a semester, and/or randomly. In one embodiment, the user can be evaluated to determine whether the user has achieved the goal provided to the user in operation 115.
  • In an exemplary embodiment, regular users such as medical student users may be limited in their ability to access and/or alter archived information regarding occurrences within the medical simulation. Training administrators and other special users may be provided with expanded or unlimited access such that they can effectively evaluate other users. For example, a medical student user may not be able view or alter a patient interaction in which he/she participated. A professor of the medical student may be given the ability to replay the patient interaction from any point of view, add commentary to the patient interaction, and/or alter the patient interaction such that the medical student user can be shown his/her mistakes.
  • In an exemplary embodiment, access to and within the medical simulation can be broken down into categories including observer access, limited student access, standard student access, assistant instructor or privileged student access, professor access, simulation administrator access, and developer access. An observer can refer to an individual who watches others participate in the medical simulation, but who does not himself or herself participate. Observer access allows the observer to follow an individual avatar in the medical simulation and view interactions of the avatar from a third person point of view. The observer is not able to communicate with the avatar, control the avatar, or otherwise affect any part of the medical simulation. The observer may be a professor who wishes to introduce his/her class to interactions within the medical simulation.
  • Users with limited student access may have access to only a limited portion of the medical simulation or only to certain avatars within the medical simulation. Alternatively, limited student access may include full access to the medical simulation for only a limited amount of time per day or per week. As an example, a limited student may be a pre-med student who is only allowed to play the role of a patient within the medical simulation. Standard student access can provide participants with the ability to fully control one or more physician avatars and one or more patient avatars within the medical simulation. The standard student can fully participate in physician/patient interactions. Assistant instructor or privileged student access can be granted to teaching instructors or exceptional students. This level of access allows the user to slightly bend the rules of the medical simulation for the sake of learning. Professor access allows the professor to start new simulation sessions, observe students, and take over student avatars. Professor access also allows the professor to edit a student's records, alter a student's access level, and alter a student's in-game abilities. Simulation administrator access allows the simulation administrator to change any variables, records, etc. regarding the medical simulation which do not require changes to the source code. Developer access allows the developer to have full and complete access to the medical simulation, including the ability to alter the source code of the medical simulation.
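The seven access categories above form an ordered privilege hierarchy, which suggests a straightforward permission check. The sketch below is a minimal, assumed implementation in Python (the capability names and the minimum-level mapping are illustrative, not taken from the patent):

```python
from enum import IntEnum

class AccessLevel(IntEnum):
    """Access categories from the text, ordered lowest to highest privilege."""
    OBSERVER = 0
    LIMITED_STUDENT = 1
    STANDARD_STUDENT = 2
    ASSISTANT_INSTRUCTOR = 3
    PROFESSOR = 4
    SIMULATION_ADMIN = 5
    DEVELOPER = 6

# Hypothetical capability -> minimum level required to exercise it.
REQUIRED_LEVEL = {
    "observe_avatar": AccessLevel.OBSERVER,
    "control_patient_avatar": AccessLevel.LIMITED_STUDENT,
    "control_physician_avatar": AccessLevel.STANDARD_STUDENT,
    "bend_simulation_rules": AccessLevel.ASSISTANT_INSTRUCTOR,
    "replay_and_annotate": AccessLevel.PROFESSOR,
    "edit_simulation_variables": AccessLevel.SIMULATION_ADMIN,
    "modify_source_code": AccessLevel.DEVELOPER,
}

def may(level: AccessLevel, capability: str) -> bool:
    """Return True if a user at `level` is permitted the capability."""
    return level >= REQUIRED_LEVEL[capability]

print(may(AccessLevel.STANDARD_STUDENT, "control_physician_avatar"))  # True
print(may(AccessLevel.OBSERVER, "replay_and_annotate"))               # False
```

Modeling the levels as an `IntEnum` makes the "each level includes the privileges below it" property a single comparison; finer-grained schemes (e.g., time-limited access for limited students) would need per-user state on top of this.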
  • In an operation 135, feedback is provided to the user. The feedback can be generated by the system, provided by an evaluator, and/or provided by a participant in a medical scenario in which the user participated. The feedback can be in the form of a score, grade, comment, or any other indicator of the user's performance within the medical simulation. In one embodiment, feedback can be provided as commentary along with an audiovisual replay of the medical scenario for which feedback is being provided. Feedback can also occur in the form of negative and positive consequences within the medical simulation. For example, if the user is able to successfully convince a patient to take his/her medicine, the patient may be cured of his/her ailment. If the user is unable to convince the patient to take his/her medicine, the patient may end up in a coma, and the patient's family may sue the user for medical malpractice. As such, the medical simulation provides users with an experiential, constructivist learning model. Feedback can be provided to the user at any time during or after the medical scenario. In one embodiment, feedback can be instantly provided by the system or a monitor when the user makes a serious or fatal mistake during a medical scenario.
  • Feedback in the form of consequences within the simulation provides users with experiential learning. In addition to consequences resulting from physician/patient interactions, the consequences may also result from a variety of user actions and choices within the simulation. For example, a user who discovers and explores a Korean neighborhood within the simulation may develop a better understanding of Korean culture, Korean attitudes towards life and death, family relationships, the use of herbs and other complementary medical techniques by Koreans, etc. The consequence of this discovery may in turn impact an interaction with a Korean patient that the user is treating. As another example, a user may decide to do a home visit for a seemingly noncompliant diabetic patient and discover that the patient lives in a homeless shelter. As a result, the system may guide the user such that he/she learns about health care options for the indigent or how/where to obtain free medication for the indigent such that the patient can be successfully treated. The home visit may even lead the user to attempt to effect political change within the simulation by creating or expanding a free clinic for the indigent. It can thus be appreciated that the simulation is dynamic and able to provide a user with challenges based on areas in which the system perceives that the user needs improvement.
  • In an operation 140, a decision is made regarding whether the user achieved the provided goal. If the user has not achieved the provided goal, the system can present the user with another medical scenario in operation 120, and the process can be repeated until the user achieves the goal. If the user has achieved the provided goal, the system can increase the user's status within the medical simulation in an operation 145. The user's status can be increased by increasing the user's skill level(s), increasing the user's salary, enhancing the user's avatar, increasing the professional prestige of the user, increasing the medical skill set of the user, lowering a malpractice rate of the user, increasing the number of patients of the user, and/or otherwise advancing the user. In an exemplary embodiment, the user can be provided with a new goal in operation 115, and the process can be repeated until the user achieves the new goal. Alternatively, upon achieving the provided goal, the user may be finished with his/her training within the medical simulation.
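The control flow of operations 120 through 145 — present a scenario, evaluate, and repeat until the goal is achieved, then advance the user — can be sketched as a simple training loop. This is an assumed Python rendering with hypothetical hook functions; the patent specifies only the flow, not an implementation:

```python
def run_training(user, next_scenario, goal_met, increase_status, max_attempts=100):
    """Sketch of operations 120-145: present scenarios until the goal is
    achieved, then increase the user's status. All callables are
    hypothetical hooks standing in for the engines described in FIG. 3."""
    for _ in range(max_attempts):
        performance = next_scenario(user)   # operations 120-125
        if goal_met(performance):           # operation 140
            increase_status(user)           # operation 145, e.g. raise skill level
            return True
    return False

# Toy usage: each attempt improves the user's score by 0.2.
user = {"skill_level": 1, "score": 0.0}

def attempt(u):
    u["score"] += 0.2
    return u["score"]

achieved = run_training(
    user, attempt,
    goal_met=lambda score: score >= 0.6,
    increase_status=lambda u: u.update(skill_level=u["skill_level"] + 1),
)
print(achieved, user["skill_level"])  # True 2
```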
  • FIG. 3 is a diagram illustrating components of a medical training system 300 in accordance with an exemplary embodiment. Additional, fewer, or different components can be included in alternative embodiments. Medical training system 300 can include a medical setting engine 305, a medical scenario engine 310, a reference engine 315, a diagnosis engine 320, a consultation engine 325, a capture engine 330, an assessment engine 335, a financial engine 340, an external engine 345, and a personal life engine 350.
  • In an exemplary embodiment, medical setting engine 305 can be used to provide and maintain a medical setting in which medical scenarios can take place. The medical setting can be a medical clinic, a hospital, an emergency room, a patient's home, an accident site, a natural disaster site, a private practice, or any other location in which a medical practitioner may be called upon to practice medicine. Alternatively, the medical setting can be any other location in which a physician, patient, or medical staff member may go during the course of practicing medicine or seeking medical treatment. In an exemplary embodiment, medical setting engine 305 can provide a plurality of medical settings such that the medical simulation is more realistic. For example, a physician may have a first appointment at a first clinic at 2:00 pm, a second appointment at a second clinic at 3:30 pm, and a third appointment at a hospital at 5:00 pm. In an exemplary embodiment, the medical settings can be part of a seamless continuum of the simulated world. Alternatively, medical settings may occur as discrete, single user or group instances within the simulated world.
  • In an exemplary embodiment, at least one medical setting within the medical simulation can be a clinic. The clinic can include a reception desk with a receptionist, a patient waiting area, a screening area, a nursing station, a plurality of physicians' offices, a plurality of patient examination rooms, a clinic manager's office, a patient check-out area, an on-site laboratory, etc. In one embodiment, the receptionist, janitors, and other staff members can be computer controlled simulations. Alternatively, any or all of the staff members can be played by system users. For example, the receptionist can be a user, and medical training system 300 can be used to train the user such that he/she becomes accustomed to handling patients, collecting insurance information, taking phone calls, scheduling follow up visits, and/or performing any other tasks expected of a receptionist.
  • In another exemplary embodiment, the clinic can also include a plurality of virtual computer terminals through which an electronic health record (EHR) system can be accessed by physicians. Physicians can use the EHR system to access a patient's personal information, insurance information, laboratory reports, x-ray data, consultant reports, and/or medical history. Physicians can also use the fully functional EHR system to order laboratory tests, to enter prescriptions, to obtain, complete, and/or dispatch standard forms (i.e., back-to-work, disability, school notes, etc.), to automatically set up ‘tickler’ files to remind the physician when patient studies and follow-up visits are due, to collect and/or analyze information regarding populations of patients with chronic conditions, etc. The EHR system may provide full coding features for billing purposes. The EHR system can also include a virtual conference feature such that users can form care teams and work in groups. The virtual conference feature can allow a plurality of users to simultaneously review patient information, consult an expert, hold a roundtable discussion, or otherwise communicate with one another. In one embodiment, the EHR system may include information regarding a real-world patient, and the virtual conference feature may be used by real-world physicians to communicate about treatment of the patient.
  • The computer terminals within the simulation can also be information portals through which physicians can practice medicine and/or learn about the medical field. Users can also use the computer terminals to acquire timely information about their patient(s), their colleagues, or any psychobiosocial topic that is relevant to the challenging problems within the simulation. The information may be provided by content experts in a synopsized, contextualized form. The information may also be provided through links to a simulated or real-world biomedical library. In one embodiment, each examination room within the clinic can include a virtual computer terminal (i.e., information portal) such that the physician can retrieve and/or enter data as the physician is examining the patient.
  • The clinic can also have a flag system in place such that users are provided with visual cues indicating the status of rooms within the clinic. For example, a blue flag outside of an exam room can indicate that a nurse is prepping a patient for a visit, a green flag can indicate that a patient is waiting to be seen, a yellow flag can indicate that a physician is in the exam room with the patient, a red flag can indicate that the physician is finished with the patient and the patient is ready for checkout, a white flag can indicate that lab tests are needed for the patient, a black flag can indicate that x-rays are needed for the patient, a brown flag can indicate that the exam room needs to be cleaned and prepared for the next patient, and so on. In addition, the clinic can be ergonomically sound, wheelchair accessible, in compliance with any applicable real-world building and/or design codes, and decorated with calming colors and decor that reflects the current standard in intelligent real-world clinic design.
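The flag system above is essentially a lookup from flag color to room state. A minimal Python sketch of that mapping, following the color scheme described in the text (the function name and "unknown" fallback are assumptions):

```python
# Mapping of flag colors to exam-room states, per the scheme described above.
ROOM_FLAGS = {
    "blue":   "nurse prepping patient for a visit",
    "green":  "patient waiting to be seen",
    "yellow": "physician in the exam room with the patient",
    "red":    "patient ready for checkout",
    "white":  "lab tests needed",
    "black":  "x-rays needed",
    "brown":  "room needs to be cleaned and prepared",
}

def room_status(flag_color: str) -> str:
    """Translate the flag color outside an exam room into its meaning."""
    return ROOM_FLAGS.get(flag_color, "unknown flag")

print(room_status("yellow"))  # physician in the exam room with the patient
```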
  • Medical scenario engine 310 can be used to implement a medical scenario within one of the medical settings. In an exemplary embodiment, the medical scenario can be an interaction between a physician and a patient, and the roles of the patient and/or the physician can be played by users. Alternatively, the roles of the patient and/or the physician can be played by computer controlled characters. Such computer controlled characters can be designed to incorporate cognitive and emotional systems such that they behave like human physicians and patients and are as believable in their roles as human physicians and patients. In an alternative embodiment, the medical scenario can be any other interaction between patients, physicians, medical staff, and/or other individuals who may be encountered during the day-to-day practice of medicine. For example, the medical scenario can be an interaction between the physician and his/her superior, an interaction between the physician and the head of accounting at his/her place of employment, an interaction between the physician and a family member of the patient, an interaction between the physician and a malpractice attorney, an interaction between the physician and a subordinate, an interaction between the physician and an insurance representative, an interaction between the physician and a pharmacist, an interaction between the patient and a receptionist, an interaction between the patient and a family member, etc.
  • The scenarios encountered by users of the simulation are not limited to interactions within the medical clinic or facility. Further, the scenarios are not limited to interactions with other users or computer controlled avatars. The scenarios can take place in whole or in part at any location within the simulation, and may include any character or three-dimensional construct within the simulation/synthetic world. As an example, a scenario may include a user going to a patient's home to determine whether the patient's walls contain asbestos which may be affecting the patient's health. The scenario may also include going to the patient's virtual neighborhood to determine whether there is pollution present, or to see whether the patient lives near a nuclear power plant, a mosquito-infested pond, or a moldy hay barn. In an exemplary embodiment, augmented reality devices such as personal digital assistants (PDAs), pagers, cellular telephones, etc. can be used in the scenarios to introduce real-world elements and/or tasks into the simulation.
  • In an exemplary embodiment, the medical scenarios can be in the form of virtual face-to-face interactions, virtual telephone interactions, virtual email interactions, or any other virtual communications. As an example, a physician can receive an urgent page from a pharmacist requesting clarification of a prescription while the physician is in the middle of a face-to-face consultation with a patient. Similarly, the physician can receive an angry phone call from his/her superior while the physician is in the middle of a telephone call with the pharmacist. The physician can also receive telephone calls from laboratory technicians, receive telephone calls from patients, receive letters from medical boards, receive emails from nurses, etc.
  • Medical scenario engine 310 can provide medical scenarios which are one time occurrences and/or medical scenarios which develop and continue over a period of time. As an example of a one time occurrence, a patient can visit a physician because the patient has an embarrassing rash on his/her arm. The physician can diagnose the rash, prescribe an anti-itch cream to the patient, and tell the patient to return if the rash has not disappeared within ten days. If the diagnosis and prescription were correct, the patient may not return and the physician may not have further contact with the patient.
  • As an example of a continuing medical scenario, the physician can see a recalcitrant diabetic patient who refuses to eat properly or take medicine. The physician can have a goal of changing the diabetic patient's attitude during visits which occur over a series of days, weeks, or months. The goal can also involve one or more tasks which are outside of the actual physician/patient interactions, but which are related to the physician/patient relationship. For example, the patient may not be able to afford medication. The physician may have to do research to determine how the patient can obtain free or discounted medicine. If the physician does not perform the research and provide the patient with the information, the patient on ensuing visits may become more contentious or withdrawn, may schedule visits more (or less) frequently, may show deteriorating control of his/her disease, or may transfer his/her care to another physician. Alternatively, the physician may have to visit the patient at home to determine that the patient lives in a homeless shelter, that the patient is indigent, and that the reason the patient's condition is not improving is because the patient cannot afford medication. Upon making this determination, the physician can use resources within the simulation to research how the patient can receive free or discounted medication. Medical scenario engine 310 can provide consequences based on the physician's handling of the diabetic patient. For example, if the physician does not properly handle the diabetic patient, the diabetic patient may eventually go into a coma, die, be involved in a car accident due to fainting behind the wheel, or otherwise suffer from his/her illness. However, if the physician is able to turn the diabetic patient around, the diabetic patient can go on to become a successful businessman, an accomplished athlete, or a good student.
  • In one embodiment, medical scenario engine 310 can provide the physician with a teaser before, during, or after the physician's first meeting with a patient. The teaser can be set in the future and can illustrate consequences which may result from the physician's handling of the patient. For example, the patient can be a 16 year old boy with diabetes who loves to skateboard, but refuses to acknowledge his condition or take any medicine to alleviate it. Prior to the boy's first visit with the physician, the physician can be shown a teaser which portrays events that will occur three months in the future if the boy does not begin taking his medicine. The teaser can show the boy preparing to perform in a televised half-pipe skateboarding competition. The boy's family, friends, and girlfriend can be in the stands to cheer for the boy and give him support. The boy can begin his run in perfect form and the crowd can raucously applaud. Toward the end of his run, while in the middle of a difficult mid-air trick, the boy can faint and fall twenty feet, head first to the ground. An ambulance can come and rush the bleeding boy to a hospital as his family loses control. In an exemplary embodiment, use of such a teaser can help the physician become emotionally attached to the patient such that the physician sincerely cares about the patient's welfare. The use of the teaser can also provide incentive to the physician to ensure that the portrayed catastrophe does not come to fruition. The teaser can also be used to teach the physician that his/her performance in the office can have far-reaching effects on numerous lives.
  • In an exemplary embodiment, the teaser can be provided to the user in the form of audio information, video information, textual information, or any other format. In an exemplary embodiment, the dialog which occurs between participants in a medical scenario can be conveyed through a headset (or other speaker) and microphone worn by the participants in the medical scenario. For example, a patient can use his/her microphone and headset to speak to the physician and hear the actual voice of the physician. Similarly, the physician can use his/her microphone and headset to speak to the patient and hear the actual voice of the patient. In one embodiment, full natural speech language processing may be used, and the user may be able to speak to a computer controlled avatar about any topic within the simulation. Alternatively, domain specific speech recognition may be used such that the user can only speak with the computer controlled avatar about a limited number of topics. For example, if medical domain specific speech recognition is used, the user can converse with a computer-controlled patient about a host of medical topics. However, the user may not be understood by the system if he/she attempts to talk with the computer-controlled patient about baseball. If the user attempts to talk baseball, the computer controlled patient may correct for this by becoming upset, asking the physician to repeat the statement, or steering the physician back to the reason for the visit. A speech analyzer can also be used to analyze the user's voice for evaluation purposes. The speech analyzer can detect nervousness, stress, tone, or any other voice characteristic known to those of skill in the art. These voice characteristics can be used to condition the user/patient interaction. 
In addition to speech analysis, a camera may be used to capture user movements, expressions, and body language, and the captured information can be attributed to the user's avatar in real time such that the avatar behaves as the user is behaving in the real-world. The captured information can also be used to evaluate the user, to ensure that the user takes the simulation seriously, and to condition the user/patient interaction.
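The domain-specific speech recognition described above — where a computer-controlled patient understands medical topics but deflects off-domain talk such as baseball — can be illustrated with a toy filter. In this assumed Python sketch, simple keyword matching stands in for real speech recognition; the vocabulary and replies are illustrative only:

```python
# Toy stand-in for domain-specific speech recognition: utterances with no
# recognized medical term make the computer-controlled patient steer the
# conversation back to the reason for the visit.
MEDICAL_TERMS = {"rash", "pain", "medicine", "symptom", "fever", "itchy"}

def patient_reply(utterance: str) -> str:
    """Reply in-domain, or deflect if no medical term is recognized."""
    words = {w.strip(".,?!").lower() for w in utterance.split()}
    if words & MEDICAL_TERMS:
        return "in-domain"
    return "Sorry, could we get back to why I'm here?"

print(patient_reply("How long have you had the rash?"))   # in-domain
print(patient_reply("Did you catch the baseball game?"))  # deflection
```

A production system would replace the keyword set with a trained language model, but the deflection behavior — upset reaction, request to repeat, or steering back to the visit — sits in the same fallback branch.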
  • In an alternative embodiment, the dialog which occurs between participants in a medical scenario can be scripted or canned, and in the form of dialog trees. The use of pre-selected dialog can ensure that symptom descriptions and responses to questions are appropriate, regardless of the medical knowledge of the participants. In addition, a dialog which is pre-written by medical professionals can be made to accurately reflect patients based on their age, ethnicity, emotional state, and medical condition. The dialog trees can be implemented in the form of multiple choices from which participants can select. For example, a patient can complain of a symptom through text in a dialog box. The physician can have the choice of asking for more details regarding the symptom, asking how long the symptom has persisted, excusing his/herself to consult with a colleague, or making an immediate diagnosis. The patient's response can be based at least in part on the physician's response, and so on. The patient can be a system user who is also choosing his/her dialog, or a computer controlled patient whose dialog is based on the dialog selections made by the physician.
  • Dialog trees can alleviate the need for convincing and realistic acting, and can also allow users who speak different languages to interact with one another. Medical training system 300 can include a translating engine such that each participant sees dialog text in his/her native language. Anonymity within medical training system 300 can also be maintained through the use of dialog trees. Anonymity can make it significantly more difficult for users to cheat the system by working together with other users. Dialog trees can also simplify the assessment of participants. Medical training system 300 can know which dialog choices are sub-optimal, and evaluate participants accordingly. In one embodiment, the medical training system can have a plurality of modes. A beginner mode may use dialog trees, and intermediate or advanced modes may use free form speech.
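A dialog tree of the kind described in the preceding two paragraphs can be represented as a graph of nodes, each holding the patient's line and the physician's multiple-choice responses. The sketch below is a minimal, assumed Python structure; the per-choice quality score is a hypothetical hook for the point above that the system can know which dialog choices are sub-optimal and evaluate participants accordingly:

```python
# Each node: the patient's line plus (choice text, next node, quality score)
# tuples for the physician. Scores are illustrative assessment weights.
TREE = {
    "start": {
        "patient": "I've had this rash on my arm for a week.",
        "choices": [
            ("Tell me more about the rash.", "details", 1.0),
            ("How long has it itched?", "duration", 0.8),
            ("It's probably nothing.", "dismiss", 0.0),
        ],
    },
    "details":  {"patient": "It's red and itchy, worse at night.", "choices": []},
    "duration": {"patient": "About five days.", "choices": []},
    "dismiss":  {"patient": "Oh... okay, I guess.", "choices": []},
}

def choose(node_id: str, choice_index: int):
    """Follow the physician's selection; return (next node id, choice score)."""
    _, next_id, score = TREE[node_id]["choices"][choice_index]
    return next_id, score

node, score = choose("start", 0)
print(node, score)  # details 1.0
```

Because every branch is authored in advance, the same structure supports translation (store the choice text per language), anonymity, and automatic assessment by summing the scores of the choices a participant actually took.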
  • In an exemplary embodiment, player emotions and/or feelings can be graphically modeled during a medical scenario. Graphical models of emotions and feelings can be used in embodiments which utilize dialog trees and/or in embodiments which utilize free form speech. For example, during an interaction between a physician and a patient, the physician may be able to see a graphical representation of a trust level of the patient. The physician can also see how his/her responses to the patient affect the trust level. If the physician answers a question in a way that avoids the question, the trust level can decrease. Conversely, if the physician takes five minutes to provide a thorough explanation to the patient, the trust level can increase. In an exemplary embodiment, any other feelings and/or emotions, including anger, nervousness, fear, joy, happiness, morale, and so on, can also be graphically represented. In one embodiment, patients may be able to see a graphical model of the emotions and/or feelings of their physicians. In another alternative embodiment, graphical models can be used in medical scenarios involving interactions between a physician and his/her superior, a physician and a colleague, a patient and a receptionist, or any other medical scenarios. Alternatively, any or all physician and patient characteristics can be conveyed through body language, facial expressions, speech, etc.
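The graphically modeled trust level described above rises with thorough answers and falls with evasive ones. A minimal Python sketch of that update, assuming a clamped 0-to-1 scale (the numeric scale and delta values are illustrative, not from the patent):

```python
def update_trust(trust: float, delta: float) -> float:
    """Nudge the patient's trust level by delta and clamp it to [0, 1].

    Positive deltas model thorough, responsive answers; negative deltas
    model evasive ones. The scale is an assumed convention.
    """
    return max(0.0, min(1.0, trust + delta))

trust = 0.5
trust = update_trust(trust, -0.2)  # physician dodges a question
trust = update_trust(trust, +0.3)  # thorough five-minute explanation
print(round(trust, 2))  # 0.6
```

The same clamped-accumulator pattern extends to the other modeled feelings named in the text (anger, nervousness, fear, joy, happiness, morale), each tracked as its own variable and rendered graphically to the other participant.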
  • In one embodiment, physicians may be prompted if a patient or other avatar with whom they are interacting supplies partial or misinformation, or is otherwise untruthful. The prompt can be a color indicator, a textual message, or any other type of indication. As an example, the physician may ask a patient if he gets any regular exercise throughout the course of an average day. The patient may respond that of course he does. The physician may be informed (through a text box, pop-up window, audio message, color indicator, etc.) that in truth, the only exercise that the patient gets is walking from his couch to his refrigerator. The physician can use such information to flush out the truth from the patient, to diagnose the patient, and/or to prescribe a treatment for the patient. The truthful information can also be used to allow the physician to better understand patient motivations and patient insecurities. In an exemplary embodiment, prompts regarding the truth underlying a physician/patient interaction may only be provided to novice users within a beginner mode. More advanced users may be expected to determine whether a patient is lying without any prompts or hints.
  • In one embodiment, an enhanced scenario replay may be used to provide feedback to the physician after the medical scenario. In an enhanced scenario replay, arrows or other indicators can point to body language, facial expressions, etc. of the patient that should have cued the physician to the patient's internal state. The physician can click on the arrows to view additional information regarding the specific body language, facial expression, etc. Alternatively, a live tutor or professor can go through a replay of the interaction with the user and explain any cues or other information which the user missed. In another alternative embodiment, a live tutor or professor can accompany the user within the medical scenario and stop the scenario at critical times to point out important cues/information which the user missed.
  • Reference engine 315 can be used to provide a full spectrum of psychobiosocial medical information to physicians in the medical simulation. The medical information can be provided by content experts in timely, synopsized, and contextualized form. Alternatively, the medical information may be provided through links to real-world data sources such as an online encyclopedia. The medical information can be used as a resource for diagnosing illnesses, identifying problems, recommending treatments, successfully solving challenging problems posed in physician/patient interactions, etc. Physicians can access the medical information through virtual computer terminals within the medical simulation, through a pop-up screen, or by any other method. The medical information can also be in the form of access to biomedical journals, textbooks, and other medical publications. The medical information can also be in the form of links to medical websites. In one embodiment, the medical information which is accessible may depend on the user and/or the version of the medical simulation. For example, a user who is a member of the general public may receive links to medical websites, a user who is a medical student may receive online textbooks, and a user who is a practicing physician may receive medical journals, textbooks, and encyclopedias. The medical information provided may also depend on a level of experience of the user. For example, a novice user may be provided with detailed, complete, and easily accessible information from a single source. A more advanced user may be forced to utilize a plurality of sources to find desired information such that real-world research is emulated. In one embodiment, reference engine 315 can allow users to customize a virtual terminal or other access point such that the medical information is organized into a personal library.
In an alternative embodiment, medical information may not be provided to users such that users are forced to rely on their own medical knowledge.
  • In an exemplary embodiment, medical training system 300 may not require user physicians to diagnose their patients because diagnosis is a skill which can be adequately taught in medical school. Rather, a primary focus of medical training system 300 can be to enhance the ability of physicians (or physicians in training) to successfully interact with patients. As such, once physicians ask patients the proper questions and perform the proper tests, diagnosis engine 320 can be used to provide users with a diagnosis based on the patient symptoms. Diagnosis engine 320 can be used as an alternative to, or in conjunction with, reference engine 315, depending on the embodiment. In one embodiment, diagnosis engine 320 may be used only in specific versions of the medical simulation, or only for specific users. For example, diagnosis engine 320 may be available for a user from the general public and a user in his/her first year of medical school, but not for a user who is a practicing physician seeking continuing medical education credits.
  • Consultation engine 325 can allow a user to consult with other users of the medical simulation, with live medical experts, with computer generated tutoring agents, etc. For example, a user may wish to speak with other users regarding how to handle a patient who is lying, a superior who is suggesting unethical conduct, a problematic pharmacist, etc. Users can also consult with live, real-world expert consultants who may variously be represented as avatars, through video, and/or through audio. As an example, a medical student user may be able to consult with one of his/her professors, a practicing physician, or a medical specialist. Other experts which may be accessible through the medical simulation can include psychologists, social scientists, medical anthropologists, population health experts, and so on. Consultations can take place in the form of a virtual chat room with actual speech, through text messages, through virtual telephone calls, or by any other method of communication. In one embodiment, the medical simulation can include a virtual conference room in which users can go to seek help, information, and guidance from other users. The virtual conference room may exist as a component of the Electronic Health Record (EHR) system within the simulation. Alternatively, the virtual conference room may be a three-dimensional construct within the simulation that exists apart from the EHR, and which features a virtual computer terminal (i.e., information portal) or other construct through which users can readily access information.
  • Capture engine 330 can capture and store data corresponding to events which occur in the medical simulation. The captured data can be performance data used to evaluate users and/or any other data which tracks users' actions within the medical simulation. In one embodiment, capture engine 330 can store all interactions which occur in the medical simulation such that the interactions can be replayed and reviewed. The replay of an interaction can be from any perspective such that a user can view facial expressions and body language of his/her avatar or such that a professor can simultaneously view all participants in the interaction. Users may also be able to view replays of medical scenarios in which they did not participate such that the users can learn from the mistakes and successes of other users.
  • In an exemplary embodiment, data stored by capture engine 330 can be used by medical training system 300, a professor, a medical board, or any other entity to grade users based on their performance during a medical scenario. Capture engine 330 may also be used to capture timing information of events which occur within the medical simulation. Timing information can include the amount of time it takes for a physician to make a diagnosis, whether the physician or patient is late for an appointment, the length of an appointment, an amount of time which the physician waits for a silent patient to speak, an amount of time spent listening to the patient, etc. Capture engine 330 can also be used to capture and store questions asked during consultation, medical reference materials, journal articles, etc. used to make a diagnosis, user actions within the simulation which occur apart from the actual physician/patient interaction, and any other information associated with the simulation.
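  • The event capture and timing behavior described above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the patent's specified interface: the class name, event kinds, and method signatures are hypothetical.

```python
import time

# Minimal sketch of a capture engine's event log. Event kinds such as
# "appointment_start" and "diagnosis" are illustrative assumptions.
class CaptureEngine:
    def __init__(self):
        self._events = []  # list of (timestamp, kind, payload) tuples

    def record(self, kind, payload, timestamp=None):
        """Store an event; timestamp defaults to the current time."""
        ts = time.time() if timestamp is None else timestamp
        self._events.append((ts, kind, payload))

    def elapsed(self, start_kind, end_kind):
        """Timing information, e.g. time from appointment start to diagnosis."""
        starts = [t for t, k, _ in self._events if k == start_kind]
        ends = [t for t, k, _ in self._events if k == end_kind]
        if starts and ends:
            return min(ends) - min(starts)
        return None

    def replay(self):
        """Return all captured events in chronological order for review."""
        return sorted(self._events)
```

In use, a replay viewer could iterate over `replay()` from any avatar's perspective, while an evaluator queries `elapsed()` for metrics such as diagnosis time or waiting time.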
  • Assessment engine 335 can be used by medical training system 300 to provide an assessment of users based on their performance in the medical simulation. The assessment can be based on responses made during medical scenarios, answers to questions, punctuality, responsiveness, a rate of patient satisfaction, etc. Assessment engine 335 can also be used to determine whether a user has satisfied a goal. If the user satisfies the goal, assessment engine 335 can increase the user's status, increase the user's salary, increase the user's skill level(s), promote the user, or otherwise reward the user within the medical simulation. As such, the medical simulation provides users with an experiential, constructivist learning model. In one embodiment, the assessment can be made at least in part by a professor or other evaluator of the user, and assessment engine 335 can receive assessment data from the evaluator. For example, assessment engine 335 and/or capture engine 330 can allow a professor to view, annotate, and otherwise comment upon a medical student user's performance during a medical scenario. If the user's performance was satisfactory and/or a goal was met, assessment engine 335 can increase the user's status accordingly.
  • Assessment engine 335 can also be used to determine and cause consequences within the simulation based on actions of the user within the simulation as he/she drives toward an ultimate or overarching goal. The consequences can be based on the handling/treatment of patients during patient interactions, information conveyed to patients, acts performed outside of patient interactions such as visiting the patient's home, responsiveness to patient telephone calls, the amount of effort placed in finding a solution for patients, and/or any other actions taken by the user within the simulation. As such, assessment engine 335 is a dynamic, iterative engine which can be used to alter a user's experience within the simulation by generating consequences that help the user to better understand the benefits or problems involved with the use of different strategies/techniques. The consequences can also be used to ensure that the user receives experience in areas where the user has performed poorly in the past.
  • Financial engine 340 can be used to track finances within the medical simulation. A user may be a clinic manager whose goal is to operate the clinic on a specific budget. Financial engine 340 can be used to keep track of the budget, clinic income, clinic expenditures, clinic liabilities and bills, and any other financial information of the clinic. Financial engine 340 can also be used to keep track of salary and expenses for individual users such that the user's financial status can be monitored. In one embodiment, financial status of individual users can be used in part to determine an overall quality of living for the users within the medical simulation. Financial engine 340 can also be influenced by assessment engine 335 based on the user's actions within the simulation. For example, a user's actions may impact patient satisfaction, patient waiting times, and the stress of nurses or other clinic personnel, and clinic revenue may be impacted accordingly. If patients are happy and there is a satisfied staff with low turnover, clinic revenue may be high. If patients are unhappy, they may not return, and the clinic revenue may be low. Financial engine 340 may also be used to economically represent a variety of different healthcare systems such that users can obtain experience in different settings. For example, financial engine 340 may be used to represent a free market healthcare system, any of a variety of socialized healthcare systems, or any other type of healthcare system. In one embodiment, users may be allowed to alter healthcare policy assumptions within the simulation to see how various policies affect the economics of the healthcare system. In an alternative embodiment, financial information may not be considered within the medical simulation.
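  • The bookkeeping and the satisfaction-to-revenue coupling described above can be sketched as follows. The class, its field names, and the linear satisfaction modifier are all assumptions for illustration; the patent does not prescribe a particular formula.

```python
# Illustrative sketch of financial-engine bookkeeping. The coupling between
# patient satisfaction and revenue is a hypothetical linear rule.
class ClinicFinances:
    def __init__(self, budget):
        self.budget = budget          # operating budget for the clinic
        self.income = 0.0
        self.expenditures = 0.0
        self.liabilities = 0.0        # outstanding liabilities and bills

    def record_income(self, amount):
        self.income += amount

    def record_expense(self, amount):
        self.expenditures += amount

    def record_liability(self, amount):
        self.liabilities += amount

    def within_budget(self):
        """True if spending and liabilities stay inside the budget."""
        return self.expenditures + self.liabilities <= self.budget

    def revenue_modifier(self, patient_satisfaction):
        """Scale expected revenue by satisfaction in [0.0, 1.0]:
        unhappy patients may not return (0.5x), happy ones boost revenue (1.5x)."""
        return 0.5 + patient_satisfaction
```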
  • External engine 345 can be used by medical training system 300 to model any outside influences which can potentially affect physicians and/or medical settings within the medical simulation. External engine 345 can be used to show how natural disasters, terrorist attacks, and other catastrophic events can affect operations within a medical setting. External engine 345 can also be used to enforce legal obligations of physicians and the medical settings in which they work. External engine 345 can be used to implement changes in health regulations which may result from governmental or internal action. External engine 345 can also introduce the element of competition such that one or more clinics, hospitals, or private practices compete against one another for patients within a specific region. External engine 345 can also model relationships between clinics and health insurance companies, relationships between physicians and malpractice insurance carriers, relationships between clinics and professional associations, relationships between physicians and professional associations, and so on.
  • Personal life engine 350 can be used by medical training system 300 to control any aspect of the personal lives of users within the medical simulation. For example, users can have homes, families, vehicles, relatives, chores, hobbies, pets, and so on within the medical simulation. As such, users can get a feel for what it is like to live the life of a physician and/or a patient. In one embodiment, the personal life engine 350 may be used in only certain versions or only for specific users of the medical simulation. For example, detailed personal lives may be an option in a medical simulation designed for the general public, but not for a medical simulation designed for practicing physicians who are trying to obtain continuing medical education credits. In an alternative embodiment, personal life engine 350 may not be used, and the medical simulation may focus solely on the professional lives of physicians.
  • In an alternative embodiment, medical training system 300 may also include a body language engine. The body language engine can ensure that an avatar's facial expressions, posture, and other body language are appropriate to what the avatar is saying and what the avatar is feeling. For example, the body language engine can ensure that an avatar who has recently experienced a death in the family appears unenergetic, distant, and melancholy. The body language engine can also control body language based on past interactions and experiences of the avatar. For example, if a prior interaction between a patient avatar and a physician avatar was friendly and productive, the patient avatar can exhibit friendly and happy body language at the commencement of a subsequent interaction with the physician avatar. Similarly, if a prior interaction between the patient avatar and the physician avatar was disastrous, the physician avatar can exhibit nervous and slightly angry body language at the commencement of a subsequent interaction with the patient avatar.
  • In alternative embodiments, medical training system 300 can also include a registration engine, a goal engine, an emotion engine, an avatar engine, and/or any other engines which can be used to implement the medical simulation. For example, the registration engine can receive registration information from a user and provide the user with access to the medical simulation. The goal engine can create, store, assign, and/or receive goals for the user. The goal engine can also track the user's progress toward achieving the goal. The emotion engine can be used to convey emotions of the avatars based on past and present experiences of the avatars within the simulation. In the case of an avatar representing a live user, these emotions can influence dialog choices presented to the user. In the case of a computer controlled avatar, the emotions can influence dialog used by the avatar. Avatar emotions can also be conveyed non-verbally through body language, expressions, posture, textual indicators, other indicators, etc. Alternatively, emotion can be conveyed through the body language engine. The avatar engine can assign avatars to the user, receive avatar selections from the user, store avatar(s) for the user, and/or enhance the avatars as time progresses and the user increases his/her status. In an alternative embodiment, any or all of the engines described herein can be incorporated into a single medical training engine.
  • As described above, the medical training system can be used to model a wide variety of interactions which can occur among medical staff, medical management, physicians, patients, emergency medical technicians, receptionists, pharmacists, insurance companies, etc. In an exemplary embodiment, the medical training system can be used as part of a course within a medical school curriculum. The course, which can run for one or more semesters, can require students to attend class for a number of hours each week and participate in the medical simulation for a number of hours each week. During the first class, the professor can talk students through an introductory demonstration of the medical simulation and have the students set up their medical simulation accounts. The students can also be asked to establish their avatars and begin using the medical simulation. Experience within the medical simulation can be gained by performing ramp-up quests, such as making introductions to other avatars within the medical simulation, executing simple tasks within the medical simulation, pulling patient data within the medical simulation, and so on. Alternatively, students can gain experience by meeting with one or more computer controlled patients. Upon gaining a sufficient amount of experience, the student can begin to interact with live and/or simulated patients in the medical simulation. The student can also play the part of a live patient within the medical simulation. During subsequent classes, the professor can show replays of and comment upon particularly good and/or particularly bad physician/patient interactions. Professors can also answer specific questions and provide philosophical guidance regarding the medical simulation.
  • Over the course, students may be expected to increase their skills in various areas such as interviewing, listening, cultural sensitivity, ethnic sensitivity, religious sensitivity, etc. Users may start with a skill level of 1 in each of a plurality of categories. As a user advances through a structured series of encounters with physicians and patients, his/her skill level can increase based on how the user handles the encounters. The user may be required to attain a specific skill level in each category to pass and/or achieve a certain grade in the course. If the medical simulation is limited to a single class at an institution, there may be scheduled times during the week when students are encouraged or required to use the medical simulation. If the medical simulation is run across many medical schools, nationwide, or worldwide, usage times may not be scheduled because there will likely always be other available users who are playing the role of patients and physicians.
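  • The per-category skill progression described above can be sketched in a few lines. The category names and pass threshold are taken from the examples in the text where given; the class structure is a hypothetical illustration.

```python
# Minimal sketch of per-category skill tracking. Category names follow the
# examples in the text; the advancement rule is an assumption.
class SkillRecord:
    CATEGORIES = ("interviewing", "listening", "cultural_sensitivity",
                  "ethnic_sensitivity", "religious_sensitivity")

    def __init__(self):
        # Users may start with a skill level of 1 in each category.
        self.levels = {c: 1 for c in self.CATEGORIES}

    def record_encounter(self, category, handled_well):
        # Skill level can increase based on how the user handles an encounter.
        if handled_well:
            self.levels[category] += 1

    def passes(self, required_level):
        # The course may require a specific level in every category to pass.
        return all(level >= required_level for level in self.levels.values())
```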
  • As an example of a session in the medical simulation, user A may log in as a physician at 4:00. User A may have a varied range of skills, e.g. level 2 in Korean cultural sensitivity, level 3 in interviewing micro-skills, and level 6 in augmented reflective listening. User A may have a scheduled appointment at 4:15 with a patient and a scheduled appointment at 4:35 with a physician. The patient can be computer controlled, or played by another user. Prior to his meeting, user A may go to his office to bring up files regarding the patient. User A can discover that the patient is a 45 year old fish market manager who is coming in for a checkup on his progress controlling a case of diabetes. User A can also be reminded that user A met with the patient a month ago and prescribed a combination of diet change and medication. Blood work done on the patient can indicate that there has not been much improvement in the patient's condition since the last visit. User A can be provided with an indication that the patient is waiting and ready in a designated examination room. User A can cause his avatar to walk to the examination room.
  • Depending on the embodiment, user A can either speak a greeting to the patient or select a greeting from one or more greeting options. The greeting options may include using the patient's name, refraining from using the patient's name, just saying hello, not using any greeting, offering a handshake, etc. If a live user is playing the role of the patient, the patient can also either speak a greeting to user A or select a greeting from one or more greeting options. The patient's greeting options may be based on the greeting selected by user A. User A can have a goal of understanding if and why the patient has or has not been compliant with his/her medical regimen. The patient may have a goal that conflicts with the goal of user A, such as convincing user A that he/she is complying with his/her diet when in reality the patient is too embarrassed to admit that he/she is confused by the diet. Similarly, the patient may have a goal of convincing user A that the patient is taking his/her medicine when in reality the patient has been laid off, has no health insurance, has only been taking one pill every third day, and is too ashamed to discuss the issue.
  • Both user A and the patient can be awarded points for their actions during the simulation. For example, user A may be awarded points for helping the patient to discuss his/her dietary noncompliance in a more honest fashion, and additional points for developing an understanding of why the patient did not follow his/her diet. Even if user A is not able to elicit an admission from the patient, user A may still be awarded points for building trust, which may result in a more honest dialog during future patient visits. In one embodiment, the dialog options presented to the users can be based on the user's skill level. For example, if user A has a high cultural sensitivity skill level, user A may be able to accurately empathize with a patient regarding the pressure to eat ethnic foods that adversely affect the patient's blood sugar, cholesterol, blood pressure, etc. User A may receive points for this technique, and may receive progressively more points if he/she can help the patient re-think his/her diet, set milestones for patient improvement, make referrals to an ethnically-savvy dietician, etc. If user A polarizes the discussion such that the patient tunes the physician out or becomes adversarial, user A may receive negative points, and adverse consequences may result. User A may also be awarded points if he is able to successfully conclude the appointment with the patient well before 4:35 such that he can prepare for and attend his next appointment. A successful conclusion may be achieved if user A does not act rushed or make the patient feel neglected or unimportant.
  • In an exemplary embodiment, a patient's health within the medical simulation can be represented along an illness trajectory, which can be seen as a two-dimensional representation of the patient's health over time. In another exemplary embodiment, the illness trajectory can be affected by dependent variables associated with patients and independent variables associated with physicians, thus creating a three-dimensional model. Dependent variables of the patient may include anxiety, resistive behaviors, and non-compliance. Independent variables of the physician may include rapport, patient-centeredness, cultural sensitivity, and communication skills. The independent variables associated with the physician can affect the dependent variables of the patient, and the dependent variables can affect the illness trajectory of the patient.
  • As an example, rapport can be an independent variable associated with a physician and defensive behavior can be a dependent variable associated with a patient. If, during a physician/patient interaction, the physician exhibits good rapport, the defensive behavior of the patient may be lowered. Because the defensive behavior is lowered, the patient may more closely follow his/her medical regimen, and the patient's health may increase over time. The interactions of these variables and the resulting health implications are an expression of the three-dimensional illness trajectory model described above. Conversely, if the physician has poor rapport, the patient's defensive behavior may increase or remain unaltered, and the illness trajectory may reflect a decrease in the patient's health. Poor rapport by the physician and the resulting increase in defensive behavior can also result in other consequences within the simulation such as tasks which the physician is asked to perform, the timing of subsequent visits by the patient, the patient's behavior or attitude during a subsequent visit, whether the patient misses a visit, etc.
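  • The variable coupling described above can be sketched as a simple discrete-time update. The linear couplings and their coefficients are assumptions chosen only to show the direction of influence (rapport lowers defensiveness, and low defensiveness improves the illness trajectory), not a model the patent specifies.

```python
# Hypothetical one-step update for the three-dimensional illness trajectory:
# a physician-side independent variable (rapport) drives a patient-side
# dependent variable (defensiveness), which drives the health value over time.
def step_trajectory(health, defensiveness, rapport, dt=1.0):
    """All quantities are normalized to [0.0, 1.0]; dt is the time step."""
    # Good rapport (> 0.5) lowers defensive behavior; poor rapport raises it.
    defensiveness += (0.5 - rapport) * 0.2 * dt
    defensiveness = min(1.0, max(0.0, defensiveness))
    # Low defensiveness -> closer adherence to the regimen -> health improves.
    health += (0.5 - defensiveness) * 0.1 * dt
    health = min(1.0, max(0.0, health))
    return health, defensiveness
```

Iterating this step over successive visits traces out the illness trajectory under a given pattern of physician behavior.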
  • In an exemplary embodiment, independent variables associated with the physician and/or dependent variables associated with the patient may be made quantifiable through the use of surrogate markers, and measured by a statistical engine. As an example, rapport (i.e., an independent variable associated with the physician which, per se, may be nebulous) may be quantified by x factors, where x can be any value. The factors can include a type of clothing worn by the physician, a color of the clothing worn by the physician, a type of greeting used by the physician, a type of handshake used by the physician, whether the physician makes eye contact with the patient, whether the physician shows up on time for the appointment, specific words used by the physician, etc. During an interaction, the statistical engine can determine how many of the rapport related factors are demonstrated by the physician, and the physician can receive a rapport score. As an example, there may be 15 factors used to quantify rapport, and the physician may be considered to have excellent rapport if he/she demonstrates 12 or more factors during an interaction. Similarly, demonstration of 9-11 factors may be good rapport, demonstration of 5-8 factors may be average rapport, and demonstration of 0-4 factors may be bad rapport. If defensive behavior (i.e., a dependent variable associated with the patient) of the patient is dependent upon the physician's rapport, the defensive behavior can increase or decrease based on the number of rapport factors exhibited by the physician.
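  • The surrogate-marker scoring described above can be sketched directly from the example figures in the text (15 factors; 12 or more is excellent, 9-11 good, 5-8 average, 0-4 bad). The individual factor names are hypothetical stand-ins for the kinds of markers the text enumerates (clothing, greeting, handshake, eye contact, punctuality, word choice).

```python
# Hypothetical surrogate-marker names for quantifying rapport; the thresholds
# below follow the worked example in the text.
RAPPORT_FACTORS = [
    "clothing_type", "clothing_color", "greeting_type", "handshake_type",
    "eye_contact", "on_time", "uses_patient_name", "open_posture",
    "active_listening_words", "empathic_statement", "avoids_jargon",
    "invites_questions", "summarizes_plan", "checks_understanding",
    "closes_warmly",
]

def rapport_rating(demonstrated):
    """Count how many rapport factors the physician demonstrated during an
    interaction and map the count to a rating."""
    n = len(set(demonstrated) & set(RAPPORT_FACTORS))
    if n >= 12:
        return "excellent"
    if n >= 9:
        return "good"
    if n >= 5:
        return "average"
    return "bad"
```

A statistical engine could then feed the resulting rating into the patient's dependent variables, e.g. raising or lowering defensive behavior based on the number of factors exhibited.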
  • FIGS. 4-8 are diagrams illustrating an exemplary physician/patient interaction within a medical simulation. The diagrams in FIGS. 4-8 are two-dimensional depictions which are used for illustrative purposes. It is important to understand that, in an exemplary embodiment, the medical simulation may be a three-dimensional world in which three-dimensional avatars possess an extensive amount of detail, expression, cognition, emotion, and realism. Further, the three-dimensional world, which can be modeled after the real world, may include numerous avatars with varying attitudes, appearances, goals, and circumstances. FIG. 4 is a diagram illustrating a physician office 400 in accordance with an exemplary embodiment. Physician office 400 includes a door 405, a window 410, artwork 415, a couch 420, a chair 425, a desk 430, a telephone 435, and a computer 440. In alternative embodiments, physician office 400 can include any other furniture, decor, reference materials, etc. such that physician office 400 appears genuine. In an exemplary embodiment, physician office 400 can be within a clinic, hospital, or other medical facility within the medical simulation.
  • A physician avatar 445 can be played by a user. Physician avatar 445 can be a three-dimensional avatar which exhibits emotion and realistic facial expressions. Physician avatar 445 can also exhibit confidence, or a lack thereof, based on the accumulated learning experiences of the user within the medical simulation. For example, physician avatar 445 may exhibit a low confidence level and/or low self esteem if the user has performed poorly in previous physician/patient interactions. Similarly, physician avatar 445 may exhibit a high confidence level and/or high self esteem if the user has excelled in previous physician/patient interactions. Confidence level, self esteem, mood, emotion, etc. can be portrayed through realistic facial expressions, gestures, posture and/or other body language of physician avatar 445. Alternatively, any or all of confidence level, self esteem, mood, emotion, etc. can be portrayed through visible gauges, meters, or other indicators. The gauges, meters, or other indicators may be visible to only the user controlling physician avatar 445, to a subset of users, or to all users, depending on the embodiment. In an exemplary embodiment, avatar 445 can be created and/or represented through any combination of real time video capture, real time audio capture, real time motion capture, full animation, the use of animation toolsets that automate the animation process, etc.
  • In an exemplary embodiment, the user can log into the medical simulation and begin preparing to meet with a patient or perform other tasks. In one embodiment, physician avatar 445 can start in physician office 400 upon login. Alternatively, physician avatar 445 can be in a home, in an apartment, in another part of the medical facility, or anywhere else within the medical simulation upon login. If physician avatar 445 does not start in physician office 400, physician avatar 445 can walk, bike, drive, or otherwise travel to physician office 400. In one embodiment, use of an office such as physician office 400 may be limited to a subset of users within the medical simulation. For example, use of physician office 400 may be a reward or privilege based on experience and progress made within the medical simulation. Users without offices can be provided cubicles, common areas, or other areas which provide the functionality of an office such that the users can receive correspondence, learn about patients, prepare for patients, etc.
  • In an exemplary embodiment, physician office 400 can be a location in which physician avatar 445 can prepare to meet with patients. If physician avatar 445 has not previously met with a patient, physician avatar 445 can use computer 440 to obtain general information about the patient, the patient's medical history, and the reasons for the patient visit. In one embodiment, physician avatar 445 can use computer 440 to experience a teaser related to the patient. The teaser can provide physician avatar 445 with information regarding the patient's past, lies that the patient has told or may try to tell, and/or potential consequences which may be realized in the future if treatment of the patient is successful or unsuccessful. In addition, if physician avatar 445 has previously met with the patient, physician avatar 445 can use computer 440 to recall what occurred in previous visit(s). Physician avatar 445 can use telephone 435 to receive audio information from nurses and other physicians, to receive alerts, to speak with patients, to speak with pharmacists, to obtain assistance from other users and/or experts, etc. Physician office 400 may also include a pager, personal digital assistant, cellular telephone, or other communication devices such that physician avatar 445 can send/receive e-mails, pages, voicemails, text messages, etc. both inside the simulation and between the simulation and the real world.
  • FIG. 5 is a diagram illustrating a reception area 500 in accordance with an exemplary embodiment. Reception area 500 includes a reception desk 505, a receptionist stool 510, chairs 515, a magazine stand 520, and a restrooms sign 525. In an alternative embodiment, reception area 500 can also include vending machines, restrooms, televisions, a fish tank, or any other items likely to be found in a genuine reception area. Patients checking in to the medical facility can wait in chairs 515 and/or read magazines from magazine stand 520 while waiting to be called to an exam room. The magazines can include links to real-world magazines, health related websites, medical information, or any other information. A receptionist avatar 530 is seated on receptionist stool 510 to help patients check in to the medical facility. In an exemplary embodiment, receptionist avatar 530 can be a computer controlled avatar. Alternatively, receptionist avatar 530 can be controlled by a user. Regardless of whether receptionist avatar 530 is computer or user controlled, receptionist avatar 530 can exhibit realistic facial expressions and emotion, including impatience, friendliness, sincerity, anger, etc.
  • A patient avatar 535 can check in to the medical facility with the assistance of receptionist avatar 530. In an exemplary embodiment, patient avatar 535 can be played by a user. During check in, patient avatar 535 can provide personal information, insurance information, billing information, or any other information which is generally provided upon checking into a medical facility. In one embodiment, if patient avatar 535 has previously visited the medical facility, patient avatar 535 can review the results of any past meetings during the check in procedure. Patient avatar 535 can also be briefed during check in regarding how to act and/or what to say during an upcoming meeting with the physician.
  • In an exemplary embodiment, patient avatar 535 can exhibit emotion, facial characteristics, and/or other traits based on medical condition, experiences during past meetings with the physician, and/or occurrences unrelated to the medical problem. For example, patient avatar 535 can exhibit sadness because of the death of a pet. Alternatively, patient avatar 535 may exhibit fear because the physician was mean and aggressive during a past visit. Alternatively, patient avatar 535 may act arrogant, condescending, or skeptical because the physician made a mistake during a past visit. Thus, it can be seen that the medical simulation is a dynamic environment in which the present behavior and/or feelings of patient avatar 535 may be based on an accumulation of past occurrences and/or present circumstances.
  • In an exemplary embodiment, patient avatar 535 and receptionist avatar 530 can communicate to one another through dialog boxes. Patient avatar 535 can speak through a dialog box 540 and receptionist avatar 530 can speak through a dialog box 545. Depending on the embodiment, dialog box 540 and dialog box 545 may be visible to all users, may be visible to only a subset of users, or may be visible only to patient avatar 535 and receptionist avatar 530. Alternatively, dialog may be shown at a top of a computer screen through which the medical simulation is accessed, at a bottom of the computer screen, at a side of the computer screen, etc. In an alternative embodiment, patient avatar 535 may communicate through natural speech, and receptionist avatar 530 may respond through computer generated speech. In such an embodiment, domain specific speech recognition may be used, and the scope of the conversation may be limited to ensure that the speech of patient avatar 535 can be accurately recognized such that receptionist avatar 530 can respond appropriately. Alternatively, full natural speech processing may be used such that conversation is not limited.
  • The following is an exemplary description of an interaction between patient avatar 535 and receptionist avatar 530. Receptionist avatar 530 can be in a good mood because all clinic staff was recently given a raise. Patient avatar 535 can be impatient and slightly angry because of a previous bad experience at the medical facility and a general dislike of the medical profession. Patient avatar 535 can be in line behind other patient avatars (not shown) attempting to check in. When patient avatar 535 is first in line, receptionist avatar 530 can say “next please.” Patient avatar 535 can step up to reception desk 505 and receptionist avatar 530 can smile at patient avatar 535 and say “hello, is this your first time seeing us?” Patient avatar 535 can exhibit an impatient facial expression and answer “no, I have been here several times in the past.” Receptionist avatar 530 can continue smiling at patient avatar 535 and say “last name please.” Patient avatar 535 can say and/or spell the requested last name. If patient avatar 535 does not need to prepare for the upcoming physician visit, receptionist avatar 530 can smile and say “thank you, please have a seat in the waiting area and your name will be called shortly.” Alternatively, patient avatar 535 may be provided with the option to prepare for the physician visit. In an exemplary embodiment, each interaction between a patient and a receptionist can be different. For example, the next time patient avatar 535 checks in, patient avatar 535 may be in a good mood, and receptionist avatar 530 may exhibit anger because of problems at home.
  • Once patient avatar 535 is checked in, patient avatar 535 can walk around reception area 500, use a restroom (not shown), purchase a snack from a vending machine (not shown), sit in one of chairs 515, read a magazine from magazine stand 520, etc. At any time after check-in, a nurse or other staff member can contact patient avatar 535 and escort patient avatar 535 to an exam room. When patient avatar 535 is in the exam room and ready to be examined, an alert can be provided to physician avatar 445 described with reference to FIG. 4. The alert can be a page, a telephone call, a text message, an e-mail, a pop-up text box, etc. Alternatively, an alert may not be provided, and physician avatar 445 may be expected to go to the exam room at the time scheduled for the appointment.
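The multiple alert channels listed above (page, telephone call, text message, e-mail, pop-up) suggest a simple dispatch table. The sketch below is illustrative only; the `notify_physician` name, identifiers, and message formats are hypothetical, not taken from the disclosure:

```python
def notify_physician(physician_id: str, exam_room: str, channel: str = "popup") -> str:
    """Format a 'patient ready' alert for the physician avatar over one of
    the channels named in the description: page, phone, text, email, popup."""
    messages = {
        "page":  f"PAGE {physician_id}: patient ready in {exam_room}",
        "phone": f"CALL {physician_id}: patient ready in {exam_room}",
        "text":  f"SMS to {physician_id}: patient ready in {exam_room}",
        "email": f"EMAIL to {physician_id}: patient ready in {exam_room}",
        "popup": f"[POP-UP] {physician_id}: patient ready in {exam_room}",
    }
    if channel not in messages:
        raise ValueError(f"unknown alert channel: {channel}")
    return messages[channel]
```

In a real system each entry would invoke an actual delivery mechanism (e.g., an in-simulation pop-up renderer) rather than returning a formatted string.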
  • FIG. 6 is a diagram illustrating a corridor 650 within the medical facility in accordance with an exemplary embodiment. Within corridor 650 is a first exam room door 655 and a second exam room door 660. First exam room door 655 can include a first flag 665 to indicate the status of the first exam room. Similarly, second exam room door 660 can include a second flag 670 to indicate the status of the second exam room. As described above, first flag 665 and second flag 670 can be used to provide users with visual cues indicating the status of the exam rooms along corridor 650. The visual cue can be a color of the flag, a position of the flag, a shape of the flag, etc. The status may be an indication that the exam room is occupied by a patient, unoccupied, in need of cleaning, ready for a patient, occupied by a nurse, occupied by cleaners, etc. Corridor 650 can also include a computer terminal 675 such that users can access the EHR system, access information, and/or communicate with other users. A nurse avatar 680 can summon patient avatar 535 described with reference to FIG. 5 when an exam room is ready. Nurse avatar 680 can be a live user or a computer-controlled avatar. Nurse avatar 680 can communicate with patient avatar 535 and/or physician avatar 445 according to any of the communication methods described herein.
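The exam room statuses enumerated above map naturally to door-flag cues. The sketch below is a minimal illustration; the particular status-to-color assignments are hypothetical, since the disclosure leaves the cue (color, position, shape) open:

```python
from enum import Enum

class RoomStatus(Enum):
    """Exam room statuses named in the description, each paired with a
    hypothetical door-flag color cue."""
    OCCUPIED_BY_PATIENT = "red"
    OCCUPIED_BY_NURSE = "blue"
    OCCUPIED_BY_CLEANERS = "orange"
    NEEDS_CLEANING = "yellow"
    READY_FOR_PATIENT = "green"
    UNOCCUPIED = "white"

def flag_color(status: RoomStatus) -> str:
    """Return the color cue shown on an exam room's door flag."""
    return status.value
```

The same enumeration could drive a flag position or shape instead of a color, per the alternatives the paragraph lists.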
  • FIG. 7 is a diagram illustrating an exam room 600 in accordance with an exemplary embodiment. Exam room 600 includes a medicine cabinet 605, a sink 610, an examination table 615, and a computer terminal 620. In alternative embodiments, exam room 600 can include any other items likely to be found in a genuine exam room. Physician avatar 445 can communicate through a dialog box 625, and patient avatar 535 can communicate through a dialog box 630. In an exemplary embodiment, the users playing physician avatar 445 and patient avatar 535 can each select dialog from a plurality of dialog choices. Alternatively, patient avatar 535 may be computer controlled, and the computer can select dialog based on the dialog used by physician avatar 445. The dialog choices can be crafted by medical professionals who have substantial experience in physician/patient interactions. As such, each interaction within the medical simulation can be tailored to emphasize a specific circumstance and teach one or more specific skills. This tailoring allows medical students and young doctors to attain the equivalent of years of knowledge and experience through the medical simulation without making mistakes with real patients. In an alternative embodiment, the medical simulation can include language processing software such that natural speech can be used instead of dialog boxes.
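Selecting the computer-controlled patient's dialog based on the physician's choice amounts to a branching-dialog table. The sketch below illustrates that idea under stated assumptions: the choice identifiers, replies, and the `DIALOG_TREE` structure are hypothetical, not the authored content the disclosure says would come from experienced medical professionals:

```python
# Hypothetical branching-dialog table: each physician dialog choice maps to
# the scripted patient reply and the next set of choices offered to the user.
DIALOG_TREE = {
    "greet_warmly": {
        "patient_reply": "Hello, doctor. I've been having headaches.",
        "next_choices": ["ask_duration", "ask_severity"],
    },
    "greet_abruptly": {
        "patient_reply": "...Hi. I guess I'm here about my headaches.",
        "next_choices": ["apologize", "ask_duration"],
    },
}

def patient_response(physician_choice: str):
    """Return the computer-selected patient reply and the physician's
    next dialog choices for a given physician dialog choice."""
    node = DIALOG_TREE.get(physician_choice)
    if node is None:
        raise KeyError(f"no scripted response for choice {physician_choice!r}")
    return node["patient_reply"], node["next_choices"]
```

Because every branch is authored, each path through the tree can be designed to exercise one specific interpersonal skill, which is the tailoring the paragraph describes.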
  • As described above, present interactions involving patient avatar 535 can be based in large part on past interactions and present circumstances of patient avatar 535. Similarly, present interactions involving physician avatar 445 can be based in large part on the learning experiences of physician avatar 445 within the medical simulation. As such, physician avatar 445 can develop and change as the medical simulation progresses. For example, if a previous meeting between patient avatar 535 and physician avatar 445 was friendly and successful, physician avatar 445 may exhibit confidence and friendliness during a present meeting with patient avatar 535. Alternatively, present behaviors, expressions, and emotions may be based on the sum of the experiences of physician avatar 445 within the medical simulation. For example, if physician avatar 445 has had more unsuccessful interactions than successful interactions, physician avatar 445 may exhibit nervousness, fear, and hesitation, regardless of whether the last interaction between physician avatar 445 and patient avatar 535 was successful.
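The "sum of experiences" rule in this paragraph (more unsuccessful than successful interactions yields nervousness regardless of the most recent outcome) can be stated directly in code. This is a minimal sketch; the function name and the three demeanor labels are hypothetical:

```python
def physician_demeanor(successful: int, unsuccessful: int) -> str:
    """Derive the physician avatar's baseline demeanor from the running
    tally of all its interactions within the simulation: the most recent
    interaction is deliberately not weighted more than any other."""
    if unsuccessful > successful:
        return "nervous"
    if successful > unsuccessful:
        return "confident"
    return "neutral"
```

The per-relationship alternative the paragraph also describes (a friendly previous meeting with a specific patient producing confidence in the next one) would instead keep a separate tally per patient avatar.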
  • FIG. 8 is a diagram illustrating a virtual conference room 800 in accordance with an exemplary embodiment. Virtual conference room 800 can allow a plurality of users to simultaneously review patient information or any of a host of other information, consult an expert, hold a roundtable discussion, or otherwise communicate with one another. Virtual conference room 800 includes a couch 805 and a table 810. Virtual conference room 800 can also include chairs, vending machines, decorations, artwork, windows, doors, and/or any other objects commonly found in a real-world conference room. Virtual conference room 800 also includes a computer terminal 815 such that users can access information and/or communicate with other users. A conference phone 820 can be used by users to receive audio data from patients, experts, professors, or other individuals. A video screen 825 can be used to provide video and/or textual information to users. Video screen 825 can also be used to hold video conferences with patients, experts, or any other individuals within the simulation.
  • Virtual conference room 800 includes a first avatar 830, a second avatar 835, and a third avatar 840. In an exemplary embodiment, first avatar 830, second avatar 835, and third avatar 840 can communicate with one another by any of the communication methods described herein. First avatar 830, second avatar 835, and third avatar 840 can also communicate with other individuals through computer terminal 815, conference phone 820, video screen 825, and/or any other communication device such as a PDA, cellular telephone, pager, etc. As an example, first avatar 830 may be a radiation oncologist for a patient, second avatar 835 may be a primary doctor of the patient, and third avatar 840 may be a social worker who is working with the patient. The radiation oncologist, the primary doctor, and the social worker can discuss the patient with one another. The radiation oncologist, primary doctor, and social worker can also speak with a bioethics expert (not shown) through video screen 825 to obtain information regarding treatment of the patient.
  • Virtual conference room 800 also includes a data store 845. Data store 845 can include a variety of information regarding a medical topic, a patient, a physician, a facility, etc. Data store 845 can include a plurality of tabs such that specific information can be readily accessed and displayed. The information can be displayed on video screen 825 or a separate data screen (not shown). As an example, first avatar 830, second avatar 835, and third avatar 840 may be discussing a patient, and data store 845 may include information regarding the patient. A first tab of data store 845 may include x-rays of the patient, a second tab of data store 845 may include the results of laboratory work done on the patient, a third tab of data store 845 may include billing information associated with the patient, a fourth tab of data store 845 may include personal information of the patient, a fifth tab of data store 845 may include links to medical resources such as websites and journals, a sixth tab of data store 845 may initiate a connection with an expert or other consultant, etc. In an exemplary embodiment, users can select a tab of data store 845 to display the information included within the tab. Tabs may be selected by causing the avatar to touch the tab, by entering a command, or by any other method. In alternative embodiments, data store 845 may be constructed as a rotating data wheel, as a table, or as any other type of data structure which is accessible to the users.
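The tabbed data store described above can be sketched as a small labeled-tab structure. This is an illustrative sketch only; the `DataStore` class, tab labels, and contents below are hypothetical:

```python
class DataStore:
    """Tabbed data store: each tab holds one category of information
    (x-rays, labs, billing, etc.) and can be selected for display on the
    video screen or a separate data screen."""

    def __init__(self):
        self.tabs = {}          # tab label -> content for that category
        self.displayed = None   # content currently shown on screen

    def add_tab(self, label, content):
        self.tabs[label] = content

    def select(self, label):
        """Select a tab and return its content for display."""
        self.displayed = self.tabs[label]
        return self.displayed

# Hypothetical usage for a patient under discussion
store = DataStore()
store.add_tab("x-rays", ["chest_2006_07.img"])
store.add_tab("labs", {"CBC": "normal"})
assert store.select("labs") == {"CBC": "normal"}
```

The alternative presentations the paragraph mentions (a rotating data wheel, a table) would change only how tabs are rendered and selected, not this underlying label-to-content mapping.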
  • The description with reference to FIGS. 4-8 is directed toward interactions among physicians, patients, and medical staff. However, the training system described herein is not limited to medical training. In an alternative embodiment, the training system may be used to train law students and attorneys how to successfully interact with clients. In such an embodiment, the dialog and/or scenarios within the simulation can be designed by law professors and experienced attorneys such that young attorneys can gain valuable experience without jeopardizing real life client relationships. For example, an attorney avatar may meet with a client avatar who is accused of first degree homicide. The attorney avatar can have a goal of convincing the client avatar that it is best if the client avatar tells the truth. In another interaction, the attorney avatar may meet with a client avatar who is seeking a divorce from her husband. The attorney avatar can have a goal of convincing the client avatar to control her emotions and be respectful towards her spouse while in the courtroom. Countless other scenarios can be used to teach attorneys how to interact with clients in virtually any area of law. In an alternative embodiment, the interaction can be a teacher/student interaction, a coach/player interaction, a manager/employee interaction, a dental hygienist/patient interaction, a nurse/patient interaction, etc.
  • One or more flow diagrams have been used to describe exemplary embodiments. The use of flow diagrams is not meant to be limiting with respect to the order of operations performed. The foregoing description of exemplary embodiments has been presented for purposes of illustration and of description. It is not intended to be exhaustive or limiting with respect to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosed embodiments. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.

Claims (19)

1. (canceled)
2. A method of interfacing with a computer system having a display device, the method comprising:
establishing a computer-controlled avatar within the computer system, wherein the computer-controlled avatar has a human appearance and is programmed to exhibit emotional behavior and cognitive behavior emulating a human in response to a user input received via an input device operably coupled to the computer, wherein the input device includes an audio input device to capture voice input and an imaging input device to capture body language input;
presenting the computer-controlled avatar to the user via the display device;
capturing the user input using the input device, wherein the user input comprises a plurality of emotional components and a plurality of cognitive components; and
presenting a response to the user input via the avatar, wherein the response to the user input comprises a plurality of emotional components and a plurality of cognitive components.
3. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the user input comprises nonverbal input.
4. The method according to claim 3, wherein the nonverbal input comprises a psychometric input.
5. The method according to claim 3, wherein the nonverbal input comprises a body language input.
6. The method according to claim 5, wherein the body language input comprises a facial expression.
7. The method according to claim 6, wherein the facial expression comprises at least one of eye movement and eye contact.
8. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the user input comprises at least one element of speech prosody.
9. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the user input comprise at least one voice analysis input captured using the audio input device.
10. The method according to claim 9, wherein the at least one voice analysis input comprises one or more of a user stress level, a user nervousness level, and a user tone of voice.
11. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the user input comprises a timing of an interaction between the user and the avatar.
12. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the response to the user input comprises a tone of voice in which the avatar speaks.
13. The method according to claim 2, wherein the plurality of emotional components and the plurality of cognitive components of the response to the user input comprises body language exhibited by the avatar.
14. The method according to claim 13, wherein the body language exhibited by the avatar comprises at least one of eye movement and eye contact.
15. The method according to claim 2, further comprising:
defining a user goal; and
evaluating the user input against the user goal.
16. The method according to claim 15, further comprising iteratively repeating the steps of capturing the user input using the input device, evaluating the user input against the user goal, and presenting a response to the user input via the avatar until the user achieves the user goal.
17. The method according to claim 15, wherein the response to the user input comprises feedback regarding a relationship between the user input and the user goal.
18. A user interface for a computer system, comprising:
a scenario processor configured to present a computer-controlled avatar, wherein the computer-controlled avatar has a human appearance and is programmed to respond to a user input with emotional behavior and cognitive behavior emulating a human;
a capture processor in operable communication with the scenario processor and configured to capture the user input; and
an input device operably coupled to the capture processor, wherein the input device includes an audio input device to capture voice input and an imaging input device to capture body language input,
wherein the user input comprises a plurality of emotional components and a plurality of cognitive components, and
wherein the response to the user input comprises a plurality of emotional components and a plurality of cognitive components.
19. The user interface according to claim 18, further comprising:
a goal processor configured to generate a user goal; and
an assessment processor in operable communication with the goal processor and the capture processor, wherein the assessment processor is configured to assess the user input against the user goal.
US13/924,205 2006-07-12 2013-06-21 Computerized medical training system Abandoned US20140127662A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US83018306P 2006-07-12 2006-07-12
US11/776,978 US8469713B2 (en) 2006-07-12 2007-07-12 Computerized medical training system
US13/924,205 US20140127662A1 (en) 2006-07-12 2013-06-21 Computerized medical training system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/924,205 US20140127662A1 (en) 2006-07-12 2013-06-21 Computerized medical training system
US14/266,042 US20140370468A1 (en) 2006-07-12 2014-04-30 Computerized medical training system
US14/742,802 US20150287330A1 (en) 2006-07-12 2015-06-18 Computerized medical training system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/776,978 Continuation US8469713B2 (en) 2006-07-12 2007-07-12 Computerized medical training system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/266,042 Continuation US20140370468A1 (en) 2006-07-12 2014-04-30 Computerized medical training system

Publications (1)

Publication Number Publication Date
US20140127662A1 true US20140127662A1 (en) 2014-05-08

Family

ID=38924187

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/776,978 Active 2031-09-29 US8469713B2 (en) 2006-07-12 2007-07-12 Computerized medical training system
US13/924,205 Abandoned US20140127662A1 (en) 2006-07-12 2013-06-21 Computerized medical training system
US14/266,042 Abandoned US20140370468A1 (en) 2006-07-12 2014-04-30 Computerized medical training system
US14/742,802 Pending US20150287330A1 (en) 2006-07-12 2015-06-18 Computerized medical training system


Country Status (9)

Country Link
US (4) US8469713B2 (en)
EP (1) EP2050086A2 (en)
JP (1) JP2009543611A (en)
KR (1) KR20090043513A (en)
CN (1) CN101506859A (en)
AU (1) AU2007272422A1 (en)
CA (1) CA2657176C (en)
MX (1) MX2009000206A (en)
WO (1) WO2008008893A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8954041B1 (en) 2011-02-08 2015-02-10 Sprint Communications Company L.P. System and method for ID platform
US8972592B1 (en) 2011-05-27 2015-03-03 Sprint Communications Company L.P. Extending an interface pack to a computer system
US9043446B1 (en) * 2011-03-10 2015-05-26 Sprint Communications Company L.P. Mirroring device interface components for content sharing
US9123062B1 (en) 2011-02-18 2015-09-01 Sprint Communications Company L.P. Ad sponsored interface pack
US9183412B2 (en) 2012-08-10 2015-11-10 Sprint Communications Company L.P. Systems and methods for provisioning and using multiple trusted security zones on an electronic device
US9189607B1 (en) 2012-06-29 2015-11-17 Sprint Communications Company L.P. Mobile phone controls preprocessor
US9386395B1 (en) 2010-09-06 2016-07-05 Sprint Communications Company L.P. Dynamic loading, unloading, and caching of alternate complete interfaces
US9413839B2 (en) 2012-07-31 2016-08-09 Sprint Communications Company L.P. Traffic management of third party applications
US9442709B1 (en) 2012-10-24 2016-09-13 Sprint Communications Company L.P. Transition experience during loading and updating an interface and applications pack
US9483253B1 (en) 2015-04-30 2016-11-01 Sprint Communications Company L.P. Methods for customization of default applications on a mobile communication device
US9513888B1 (en) 2014-01-30 2016-12-06 Sprint Communications Company L.P. Virtual preloads
US9619810B1 (en) 2011-10-11 2017-04-11 Sprint Communications Company L.P. Zone architecture for dynamic targeted content creation
US10026328B2 (en) * 2014-10-21 2018-07-17 i-Human Patients, Inc. Dynamic differential diagnosis training and evaluation system and method for patient condition determination

Families Citing this family (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2007272422A1 (en) * 2006-07-12 2008-01-17 Medical Cyberworlds, Inc. Computerized medical training system
KR101218689B1 (en) * 2006-08-25 2013-01-04 각코호진 니혼시카다이가쿠 Medical training apparatus
US7572127B2 (en) * 2006-10-20 2009-08-11 Johnson Lanny L Operating room educational television “OREDUTV”.
US20080120558A1 (en) * 2006-11-16 2008-05-22 Paco Xander Nathan Systems and methods for managing a persistent virtual avatar with migrational ability
US9253183B2 (en) 2006-11-16 2016-02-02 Mark Stephen Meadows Systems and methods for authenticating an avatar
US20080231627A1 (en) * 2007-03-20 2008-09-25 Robert Allen Shearer Using Ray Tracing to Enhance Artificial Intelligence Character Behavior
JP4367663B2 (en) * 2007-04-10 2009-11-18 ソニー株式会社 Image processing apparatus, image processing method, program
US20080313973A1 (en) * 2007-06-19 2008-12-25 High Performance Marketing Method and apparatus for providing care
JP4506795B2 (en) 2007-08-06 2010-07-21 ソニー株式会社 Biological motion information display processing apparatus, a biological motion information processing system
US20090061402A1 (en) * 2007-08-29 2009-03-05 Kiran Musunuru Methods And Systems For Providing Interactive Educational Training
US7895049B2 (en) * 2007-11-30 2011-02-22 Yahoo! Inc. Dynamic representation of group activity through reactive personas
US20090164917A1 (en) * 2007-12-19 2009-06-25 Kelly Kevin M System and method for remote delivery of healthcare and treatment services
US20090264173A1 (en) * 2008-01-28 2009-10-22 Zois Chris Video game and method of play
US8005656B1 (en) * 2008-02-06 2011-08-23 Ankory Ran Apparatus and method for evaluation of design
US9443141B2 (en) * 2008-06-02 2016-09-13 New York University Method, system, and computer-accessible medium for classification of at least one ICTAL state
US8677254B2 (en) * 2008-07-24 2014-03-18 International Business Machines Corporation Discerning and displaying relationships between avatars
JP4565220B2 (en) * 2008-07-30 2010-10-20 学校法人 日本歯科大学 Medical training apparatus
US9223469B2 (en) * 2008-08-22 2015-12-29 Intellectual Ventures Fund 83 Llc Configuring a virtual world user-interface
US8562357B2 (en) * 2008-10-08 2013-10-22 American College Of Surgeons Interactive educational system and method
US9408537B2 (en) * 2008-11-14 2016-08-09 At&T Intellectual Property I, Lp System and method for performing a diagnostic analysis of physiological information
WO2010093780A2 (en) * 2009-02-13 2010-08-19 University Of Florida Research Foundation, Inc. Communication and skills training using interactive virtual humans
US20100217619A1 (en) * 2009-02-26 2010-08-26 Aaron Roger Cox Methods for virtual world medical symptom identification
US8977959B2 (en) * 2009-03-25 2015-03-10 International Business Machines Corporation Visualization of medical conditions in a virtual universe
US9100435B2 (en) 2009-04-02 2015-08-04 International Business Machines Corporation Preferred name presentation in online environments
US8806337B2 (en) * 2009-04-28 2014-08-12 International Business Machines Corporation System and method for representation of avatars via personal and group perception, and conditional manifestation of attributes
US8608481B2 (en) * 2009-05-13 2013-12-17 Medtronic Navigation, Inc. Method and apparatus for identifying an instrument location based on measuring a characteristic
US20100299155A1 (en) * 2009-05-19 2010-11-25 Myca Health, Inc. System and method for providing a multi-dimensional contextual platform for managing a medical practice
US8702426B2 (en) * 2009-05-26 2014-04-22 Charles Marion Soto Method and apparatus for teaching cosmetology
US20100302138A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Methods and systems for defining or modifying a visual representation
US9754512B2 (en) 2009-09-30 2017-09-05 University Of Florida Research Foundation, Inc. Real-time feedback of task performance
KR101066515B1 (en) * 2009-10-29 2011-09-21 현대제철 주식회사 Control method of education system for operating device in iron foundry
WO2011058584A1 (en) * 2009-11-10 2011-05-19 Selex Sistemi Integrati S.P.A. Avatar-based virtual collaborative assistance
US20110165542A1 (en) * 2010-01-07 2011-07-07 Fairfield University Multi-parameter, customizable simulation building system for clinical scenarios for educating and training nurses and other health care professionals
US20110189638A1 (en) * 2010-02-03 2011-08-04 ImplementHIT System and method for learning assessment
US9138186B2 (en) * 2010-02-18 2015-09-22 Bank Of America Corporation Systems for inducing change in a performance characteristic
US20110229862A1 (en) * 2010-03-18 2011-09-22 Ohm Technologies Llc Method and Apparatus for Training Brain Development Disorders
JP5504462B2 (en) * 2010-06-22 2014-05-28 株式会社モリタ製作所 Medical training apparatus, medical training methods and programs
US20120129141A1 (en) * 2010-11-24 2012-05-24 Doreen Granpeesheh e-Learning System
JP5723701B2 (en) * 2011-07-04 2015-05-27 日立Geニュークリア・エナジー株式会社 Plant operation training apparatus
US9297819B2 (en) * 2011-07-22 2016-03-29 Sysmex Corporation Hematology analyzing system and analyzer
US9317653B2 (en) * 2011-07-22 2016-04-19 Sysmex Corporation Analyzer, and method for performing a measurement on a sample
JP2013088878A (en) * 2011-10-13 2013-05-13 Sony Corp Information processing system, information processing method, and program
US9870552B2 (en) 2011-10-19 2018-01-16 Excalibur Ip, Llc Dynamically updating emoticon pool based on user targeting
KR101252654B1 (en) 2011-12-09 2013-05-14 에이알비전 (주) Health care method for self-diagnosis
US20150010892A1 (en) * 2012-02-17 2015-01-08 Laerdal Medical As System and Method for Maintenance of Competence
KR20140135721A (en) * 2012-02-17 2014-11-26 라엘덜 메디칼 에이에스 Device to record competence
US20130257877A1 (en) * 2012-03-30 2013-10-03 Videx, Inc. Systems and Methods for Generating an Interactive Avatar Model
US9886873B2 (en) * 2012-04-19 2018-02-06 Laerdal Medical As Method and apparatus for developing medical training scenarios
US20130280686A1 (en) * 2012-04-19 2013-10-24 Martin Hetland Medical Procedure Training System
KR101314121B1 (en) * 2012-05-31 2013-10-15 홍금나 Performance system and method for role play
JP6007376B2 (en) * 2012-07-26 2016-10-12 学校法人 名城大学 Drug therapy decision capacity building methods and drug therapy decision capacity building program
US20160012349A1 (en) * 2012-08-30 2016-01-14 Chun Shin Limited Learning system and method for clinical diagnosis
US20140162220A1 (en) * 2012-12-11 2014-06-12 Quest 2 Excel, Inc. System, method and computer program product for gamification of business processes
US20140164037A1 (en) * 2012-12-11 2014-06-12 Quest 2 Excel, Inc. Gamified project management system and method
US9117316B1 (en) 2012-12-20 2015-08-25 Lockheed Martin Corporation Social identity models for automated entity interactions
CA2895778A1 (en) * 2012-12-20 2014-06-26 Accenture Global Services Limited Context based augmented reality
US20140220514A1 (en) * 2013-02-04 2014-08-07 Gamxing Inc. Games for learning regulatory best practices
US20140315172A1 (en) * 2013-03-15 2014-10-23 Curtis Cheeks, JR. Systems and methods for interactive scenario-based medical instruction
US20140278605A1 (en) * 2013-03-15 2014-09-18 Ncr Corporation System and method of completing an activity via an agent
US20150026231A1 (en) * 2013-07-19 2015-01-22 Springboard Technologies, LLC Facilitation of interaction with available subject matter experts (smes) associated with digital multimedia segment
JP6491207B2 (en) * 2013-08-16 2019-03-27 インテュイティブ サージカル オペレーションズ, インコーポレイテッド System and method for logging and regeneration between different devices
US20150086948A1 (en) * 2013-09-25 2015-03-26 Georgianna Donadio Behavioral change model with evidence-based, clinically proven communication skills and methods for providing the online education of the same
CN103491104B (en) * 2013-10-09 2016-08-17 廖洪銮 An interactive method and system based on non-real-time media information
JP6288548B2 (en) * 2013-10-09 2018-03-07 学校法人 日本歯科大学 Medical training apparatus
US9313646B2 (en) 2013-10-17 2016-04-12 At&T Intellectual Property I, Lp Method and apparatus for adjusting device persona
WO2015066542A1 (en) * 2013-10-31 2015-05-07 Understand.Com, Llc Video role-play learning system and process
US10311482B2 (en) 2013-11-11 2019-06-04 At&T Intellectual Property I, Lp Method and apparatus for adjusting a digital assistant persona
US20150179083A1 (en) * 2013-12-23 2015-06-25 Abb Technology Ag Interactive interface for asset health management
CN104042346A (en) * 2014-06-11 2014-09-17 丛中笑 Operating room monitoring system
US9277180B2 (en) * 2014-06-30 2016-03-01 International Business Machines Corporation Dynamic facial feature substitution for video conferencing
US9204098B1 (en) 2014-06-30 2015-12-01 International Business Machines Corporation Dynamic character substitution for web conferencing based on sentiment
US20170249854A1 (en) * 2014-08-08 2017-08-31 Baylor Research Institute Systems and methods for virtual learning environments
JP2016080752A (en) * 2014-10-10 2016-05-16 学校法人早稲田大学 Medical activity training appropriateness evaluation device
MX2017006191A (en) * 2014-11-12 2017-07-31 Baylor College Medicine Mobile clinics.
CN104680910A (en) * 2015-03-03 2015-06-03 罗娜 Medical simulation diagnosis and treatment teaching system based on cloud platform
US9786274B2 (en) 2015-06-11 2017-10-10 International Business Machines Corporation Analysis of professional-client interactions
KR101716572B1 (en) * 2015-06-22 2017-03-15 농협생명보험 주식회사 Roll playing game method for insuarance products sales education
WO2017061574A1 (en) * 2015-10-06 2017-04-13 株式会社MedVision Control system for surgical procedure simulator
EP3200044A1 (en) * 2016-01-29 2017-08-02 Tata Consultancy Services Limited Virtual reality based interactive learning
TWI597699B (en) * 2016-06-21 2017-09-01 Asian Landseed Medical Education Corp Medical diagnosis management education system and method thereof
CN106297464A (en) * 2016-08-19 2017-01-04 上海梅斯医药科技有限公司 Virtual diagnosis and treatment system
CN106296511A (en) * 2016-08-19 2017-01-04 上海梅斯医药科技有限公司 Virtual diagnosis and treatment system
WO2018097923A1 (en) 2016-11-22 2018-05-31 PraxiCut, LLC Surgical simulation systems, methods, and compositions
US20180218628A1 (en) * 2017-01-31 2018-08-02 Ent. Services Development Corporation Lp Information technology user behavior monitoring rule generation
WO2018195255A1 (en) * 2017-04-20 2018-10-25 Becton, Dickinson And Company Diabetes therapy training device
US10192410B1 (en) 2018-04-06 2019-01-29 Seeca Medical, Inc. System for providing notification of a status of a patient examination and related methods

Family Cites Families (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5597312A (en) * 1994-05-04 1997-01-28 U S West Technologies, Inc. Intelligent tutoring method and system
US5682469A (en) * 1994-07-08 1997-10-28 Microsoft Corporation Software platform having a real world interface with animated characters
US6285380B1 (en) * 1994-08-02 2001-09-04 New York University Method and system for scripting interactive animated actors
US5880731A (en) * 1995-12-14 1999-03-09 Microsoft Corporation Use of avatars with automatic gesturing and bounded interaction in on-line chat session
US7811090B2 (en) * 1996-05-08 2010-10-12 Gaumard Scientific Company, Inc. Interactive education system for teaching patient care
US5853292A (en) * 1996-05-08 1998-12-29 Gaumard Scientific Company, Inc. Computerized education system for teaching patient care
US6443735B1 (en) * 1996-05-08 2002-09-03 Gaumard Scientific, Inc. Computerized education system for teaching patient care
US5867175A (en) * 1996-05-24 1999-02-02 Microsoft Corporation Method and apparatus for scripting animation
US6246975B1 (en) * 1996-10-30 2001-06-12 American Board Of Family Practice, Inc. Computer architecture and process of patient generation, evolution, and simulation for computer based testing system
US5884029A (en) * 1996-11-14 1999-03-16 International Business Machines Corporation User interaction with intelligent virtual objects, avatars, which interact with other avatars controlled by different users
JPH11197159A (en) * 1998-01-13 1999-07-27 Hitachi Ltd Operation supporting system
IL123073D0 (en) * 1998-01-26 1998-09-24 Simbionix Ltd Endoscopic tutorial system
US6077082A (en) * 1998-02-02 2000-06-20 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Personal patient simulation
US6466213B2 (en) * 1998-02-13 2002-10-15 Xerox Corporation Method and apparatus for creating personal autonomous avatars
JPH11231770A (en) * 1998-02-19 1999-08-27 Mitsubishi Electric Corp Medical simulator reporting device
US6160986A (en) * 1998-04-16 2000-12-12 Creator Ltd Interactive toy
US6074213A (en) * 1998-08-17 2000-06-13 Hon; David C. Fractional process simulator with remote apparatus for multi-locational training of medical teams
US7198490B1 (en) * 1998-11-25 2007-04-03 The Johns Hopkins University Apparatus and method for training using a human interaction simulator
US6358053B1 (en) * 1999-01-15 2002-03-19 Unext.Com Llc Interactive online language instruction
US7120880B1 (en) * 1999-02-25 2006-10-10 International Business Machines Corporation Method and system for real-time determination of a subject's interest level to media content
US7107253B1 (en) * 1999-04-05 2006-09-12 American Board Of Family Practice, Inc. Computer architecture and process of patient generation, evolution and simulation for computer based testing system using bayesian networks as a scripting language
US7061493B1 (en) * 1999-04-07 2006-06-13 Fuji Xerox Co., Ltd. System for designing and rendering personalities for autonomous synthetic characters
US6296487B1 (en) * 1999-06-14 2001-10-02 Ernest L. Lotecka Method and system for facilitating communicating and behavior skills training
US6561811B2 (en) * 1999-08-09 2003-05-13 Entertainment Science, Inc. Drug abuse prevention computer game
US6507353B1 (en) * 1999-12-10 2003-01-14 Godot Huard Influencing virtual actors in an interactive environment
US20020005865A1 (en) * 1999-12-17 2002-01-17 Barbara Hayes-Roth System, method, and device for authoring content for interactive agents
US6826540B1 (en) * 1999-12-29 2004-11-30 Virtual Personalities, Inc. Virtual human interface for conducting surveys
US6807535B2 (en) * 2000-03-08 2004-10-19 Lnk Corporation Intelligent tutoring system
US6705869B2 (en) * 2000-06-02 2004-03-16 Darren Schwartz Method and system for interactive communication skill training
US6692258B1 (en) * 2000-06-26 2004-02-17 Medical Learning Company, Inc. Patient simulator
US20020008716A1 (en) * 2000-07-21 2002-01-24 Colburn Robert A. System and method for controlling expression characteristics of a virtual agent
US7319992B2 (en) * 2000-09-25 2008-01-15 The Mission Corporation Method and apparatus for delivering a virtual reality environment
AU3292802A (en) * 2000-11-03 2002-05-15 Zoesis Inc Interactive character system
JP2002157209A (en) * 2000-11-17 2002-05-31 Dorikomu:Kk Retrieval system using three-dimensional virtual space
US20030028498A1 (en) * 2001-06-07 2003-02-06 Barbara Hayes-Roth Customizable expert agent
US7610556B2 (en) * 2001-12-28 2009-10-27 Microsoft Corporation Dialog manager for interactive dialog with computer user
US7663628B2 (en) * 2002-01-22 2010-02-16 Gizmoz Israel 2002 Ltd. Apparatus and method for efficient animation of believable speaking 3D characters in real time
US7401295B2 (en) * 2002-08-15 2008-07-15 Simulearn, Inc. Computer-based learning system
US20040064298A1 (en) * 2002-09-26 2004-04-01 Robert Levine Medical instruction using a virtual patient
US20040179039A1 (en) * 2003-03-03 2004-09-16 Blattner Patrick D. Using avatars to communicate
GB0306875D0 (en) * 2003-03-25 2003-04-30 British Telecomm Apparatus and method for generating behavior in an object
US20040197750A1 (en) * 2003-04-01 2004-10-07 Donaher Joseph G. Methods for computer-assisted role-playing of life skills simulations
US7090576B2 (en) * 2003-06-30 2006-08-15 Microsoft Corporation Personalized behavior of computer controlled avatars in a virtual reality environment
US20050143174A1 (en) * 2003-08-19 2005-06-30 Goldman Daniel P. Systems and methods for data mining via an on-line, interactive game
US7725419B2 (en) * 2003-09-05 2010-05-25 Samsung Electronics Co., Ltd Proactive user interface including emotional agent
US20050054381A1 (en) * 2003-09-05 2005-03-10 Samsung Electronics Co., Ltd. Proactive user interface
US8990688B2 (en) * 2003-09-05 2015-03-24 Samsung Electronics Co., Ltd. Proactive user interface including evolving agent
US7607097B2 (en) * 2003-09-25 2009-10-20 International Business Machines Corporation Translating emotion to braille, emoticons and other special symbols
US20050223328A1 (en) * 2004-01-30 2005-10-06 Ashish Ashtekar Method and apparatus for providing dynamic moods for avatars
US8480403B2 (en) * 2004-02-02 2013-07-09 University Of Maryland, Baltimore Techniques for delivering medical care by improving decision-making skills of medical personnel
US20050255434A1 (en) * 2004-02-27 2005-11-17 University Of Florida Research Foundation, Inc. Interactive virtual characters for training including medical diagnosis training
US7836461B2 (en) * 2004-03-15 2010-11-16 Imi Innovations, Inc. Computer interface system using multiple independent hardware and virtual human-computer input devices and related enabling subroutines
CN1934581A (en) * 2004-03-26 2007-03-21 A.G.I.株式会社 Will expression model device, psychological effect program, and will expression simulation method
US7599838B2 (en) * 2004-09-01 2009-10-06 Sap Aktiengesellschaft Speech animation with behavioral contexts for application scenarios
US20060134585A1 (en) * 2004-09-01 2006-06-22 Nicoletta Adamo-Villani Interactive animation system for sign language
US20060122840A1 (en) * 2004-12-07 2006-06-08 David Anderson Tailoring communication from interactive speech enabled and multimodal services
US7464010B2 (en) * 2004-12-21 2008-12-09 Electronics And Telecommunications Research Institute User interface design and evaluation system and hand interaction based user interface design and evaluation system
US7797261B2 (en) * 2005-04-13 2010-09-14 Yang George L Consultative system
US20060248461A1 (en) * 2005-04-29 2006-11-02 Omron Corporation Socially intelligent agent software
US8024276B2 (en) * 2005-05-24 2011-09-20 Drane Associates, Lp Method for interactive learning and training
US20070021200A1 (en) * 2005-07-22 2007-01-25 David Fox Computer implemented character creation for an interactive user experience
US20070074114A1 (en) * 2005-09-29 2007-03-29 Conopco, Inc., D/B/A Unilever Automated dialogue interface
US20070166690A1 (en) * 2005-12-27 2007-07-19 Bonnie Johnson Virtual counseling practice
US7627536B2 (en) 2006-06-13 2009-12-01 Microsoft Corporation Dynamic interaction menus from natural language representations
GB0613832D0 (en) * 2006-07-12 2006-08-23 Univ Keele Virtual human interaction system
US8021160B2 (en) * 2006-07-22 2011-09-20 Industrial Technology Research Institute Learning assessment method and device using a virtual tutor
US8012023B2 (en) * 2006-09-28 2011-09-06 Microsoft Corporation Virtual entertainment
US20080104512A1 (en) * 2006-10-31 2008-05-01 Motorola, Inc. Method and apparatus for providing realtime feedback in a voice dialog system
WO2008067413A2 (en) * 2006-11-28 2008-06-05 Attune Interactive, Inc. Training system using an interactive prompt character
GB0703974D0 (en) * 2007-03-01 2007-04-11 Sony Comp Entertainment Europe Entertainment device
US20100114595A1 (en) * 2007-03-02 2010-05-06 Greg Richard Method and system for providing health information
US7814041B2 (en) * 2007-03-20 2010-10-12 Caporale John L System and method for control and training of avatars in an interactive environment
EP2140442A4 (en) * 2007-03-28 2015-04-15 Breakthrough Performancetech Llc Systems and methods for computerized interactive training
US20080268418A1 (en) * 2007-04-25 2008-10-30 Tashner John H Virtual education system and method of instruction
US8825468B2 (en) * 2007-07-31 2014-09-02 Kopin Corporation Mobile wireless display providing speech to speech translation and avatar simulating human attributes
US8700422B2 (en) * 2007-05-16 2014-04-15 Koninklijke Philips N.V. Apparatus and methods for medical patient role playing/simulation activity
TWI375933B (en) * 2007-08-07 2012-11-01 Triforce Co Ltd Language learning method and system thereof
US20090044112A1 (en) * 2007-08-09 2009-02-12 H-Care Srl Animated Digital Assistant
US20090098524A1 (en) * 2007-09-27 2009-04-16 Walton Brien C Internet-based Pedagogical and Andragogical Method and System Using Virtual Reality
US20090094517A1 (en) * 2007-10-03 2009-04-09 Brody Jonathan S Conversational advertising
US20100159434A1 (en) * 2007-10-11 2010-06-24 Samsun Lampotang Mixed Simulator and Uses Thereof
US9381438B2 (en) * 2007-11-07 2016-07-05 International Business Machines Corporation Dynamically displaying personalized content in an immersive environment
ITPO20080002A1 (en) * 2008-01-22 2009-07-23 Riccardo Vieri System and method for contextual advertising generation during the sending of text messages on the device and interface
US8156060B2 (en) * 2008-02-27 2012-04-10 Inteliwise Sp Z.O.O. Systems and methods for generating and implementing an interactive man-machine web interface based on natural language processing and avatar virtual agent based character
US7953255B2 (en) * 2008-05-01 2011-05-31 At&T Intellectual Property I, L.P. Avatars in social interactive television
US9552739B2 (en) 2008-05-29 2017-01-24 Intellijax Corporation Computer-based tutoring method and system
US8597031B2 (en) * 2008-07-28 2013-12-03 Breakthrough Performancetech, Llc Systems and methods for computerized interactive skill training
US8562357B2 (en) * 2008-10-08 2013-10-22 American College Of Surgeons Interactive educational system and method
US8159504B2 (en) * 2008-10-16 2012-04-17 At&T Intellectual Property I, L.P. System and method for presenting an avatar
WO2010074786A2 (en) * 2008-12-04 2010-07-01 Total Immersion Software, Inc. System and methods for dynamically injecting expression information into an animated facial mesh
US8600731B2 (en) * 2009-02-04 2013-12-03 Microsoft Corporation Universal translator
KR101558553B1 (en) * 2009-02-18 2015-10-08 삼성전자 주식회사 Avatar facial expressions control
US20100217619A1 (en) * 2009-02-26 2010-08-26 Aaron Roger Cox Methods for virtual world medical symptom identification
CA2755899A1 (en) * 2009-03-23 2010-09-30 Jay Shiro Tashiro Method for competency assessment of healthcare students and practitioners
US8977959B2 (en) * 2009-03-25 2015-03-10 International Business Machines Corporation Visualization of medical conditions in a virtual universe
US9489039B2 (en) * 2009-03-27 2016-11-08 At&T Intellectual Property I, L.P. Systems and methods for presenting intermediaries
US8195430B2 (en) * 2009-03-31 2012-06-05 Microsoft Corporation Cognitive agent
KR101597286B1 (en) * 2009-05-07 2016-02-25 삼성전자주식회사 Device and method for generating an avatar image message

Patent Citations (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4528989A (en) * 1982-10-29 1985-07-16 Weinblatt Lee S Screening method for monitoring physiological variables
US4992867A (en) * 1990-02-28 1991-02-12 Weinblatt Lee S Technique for monitoring magazine readers while permitting a greater choice for the reader of possible reading positions
US5219322A (en) * 1992-06-01 1993-06-15 Weathers Lawrence R Psychotherapy apparatus and method for treating undesirable emotional arousal of a patient
US5406956A (en) * 1993-02-11 1995-04-18 Francis Luca Conte Method and apparatus for truth detection
US5864363A (en) * 1995-03-30 1999-01-26 C-Vis Computer Vision Und Automation Gmbh Method and device for automatically taking a picture of a person's face
US5676138A (en) * 1996-03-15 1997-10-14 Zawilinski; Kenneth Michael Emotional response analyzer system with multimedia display
US6228038B1 (en) * 1997-04-14 2001-05-08 Eyelight Research N.V. Measuring and processing data in reaction to stimuli
US6102870A (en) * 1997-10-16 2000-08-15 The Board Of Trustees Of The Leland Stanford Junior University Method for inferring mental states from eye movements
US6638217B1 (en) * 1997-12-16 2003-10-28 Amir Liberman Apparatus and methods for detecting emotions
US6190314B1 (en) * 1998-07-15 2001-02-20 International Business Machines Corporation Computer input device with biosensors for sensing user emotions
US6401050B1 (en) * 1999-05-21 2002-06-04 The United States Of America As Represented By The Secretary Of The Navy Non-command, visual interaction system for watchstations
US6353810B1 (en) * 1999-08-31 2002-03-05 Accenture Llp System, method and article of manufacture for an emotion detection system improving emotion recognition
US6697457B2 (en) * 1999-08-31 2004-02-24 Accenture Llp Voice messaging system that organizes voice messages based on detected emotion
US20060293921A1 (en) * 2000-10-19 2006-12-28 Mccarthy John Input device for web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US7039676B1 (en) * 2000-10-31 2006-05-02 International Business Machines Corporation Using video image analysis to automatically transmit gestures over a network in a chat or instant messaging session
US20020135618A1 (en) * 2001-02-05 2002-09-26 International Business Machines Corporation System and method for multi-modal focus detection, referential ambiguity resolution and mood classification using multi-modal input
US7027621B1 (en) * 2001-03-15 2006-04-11 Mikos, Ltd. Method and apparatus for operator condition monitoring and assessment
US20040249650A1 (en) * 2001-07-19 2004-12-09 Ilan Freedman Method apparatus and system for capturing and analyzing interaction based content
US7246081B2 (en) * 2001-09-07 2007-07-17 Hill Daniel A Method of facial coding monitoring for the purpose of gauging the impact and appeal of commercially-related stimuli
US20030125610A1 (en) * 2001-10-26 2003-07-03 Sachs Gary Steven Computer system and method for training certifying or monitoring human clinical raters
US7076430B1 (en) * 2002-05-16 2006-07-11 At&T Corp. System and method of providing conversational visual prosody for talking heads
US7057662B2 (en) * 2002-11-22 2006-06-06 Hewlett-Packard Development Company, L.P. Retractable camera apparatus
US7078911B2 (en) * 2003-02-06 2006-07-18 Cehelnik Thomas G Patent application for a computer motional command interface
US7068277B2 (en) * 2003-03-13 2006-06-27 Sony Corporation System and method for animating a digital facial model
US7218320B2 (en) * 2003-03-13 2007-05-15 Sony Corporation System and method for capturing facial and body motion
US20040210159A1 (en) * 2003-04-15 2004-10-21 Osman Kibar Determining a psychological state of a subject
US7092001B2 (en) * 2003-11-26 2006-08-15 Sap Aktiengesellschaft Video conferencing system with physical cues
US20050289582A1 (en) * 2004-06-24 2005-12-29 Hitachi, Ltd. System and method for capturing and using biometrics to review a product, service, creative work or thing
US20060064037A1 (en) * 2004-09-22 2006-03-23 Shalon Ventures Research, Llc Systems and methods for monitoring and modifying behavior
US20070132597A1 (en) * 2005-12-09 2007-06-14 Valence Broadband, Inc. Methods and systems for monitoring patient support exiting and initiating response
US20070288263A1 (en) * 2005-12-09 2007-12-13 Valence Broadband, Inc. Methods and systems for monitoring quality and performance at a healthcare facility
US20070167690A1 (en) * 2005-12-19 2007-07-19 Olemi Trading Inc. Mind-body correlation data evaluation apparatus and method of evaluating mind-body correlation data
US20070150916A1 (en) * 2005-12-28 2007-06-28 James Begole Using sensors to provide feedback on the access of digital content
US20070265507A1 (en) * 2006-03-13 2007-11-15 Imotions Emotion Technology Aps Visual attention and emotional response detection and display system
US8469713B2 (en) * 2006-07-12 2013-06-25 Medical Cyberworlds, Inc. Computerized medical training system
US20080065468A1 (en) * 2006-09-07 2008-03-13 Charles John Berg Methods for Measuring Emotive Response and Selection Preference
US20080146890A1 (en) * 2006-12-19 2008-06-19 Valencell, Inc. Telemetric apparatus for health and environmental monitoring
US20090270170A1 (en) * 2008-04-29 2009-10-29 Bally Gaming, Inc. Biofeedback for a gaming device, such as an electronic gaming machine (EGM)
US20100010370A1 (en) * 2008-07-09 2010-01-14 De Lemos Jakob System and method for calibrating and normalizing eye data in emotional testing
US20100010317A1 (en) * 2008-07-09 2010-01-14 De Lemos Jakob Self-contained data collection system for emotional response testing
US8986218B2 (en) * 2008-07-09 2015-03-24 Imotions A/S System and method for calibrating and normalizing eye data in emotional testing
US8136944B2 (en) * 2008-08-15 2012-03-20 iMotions - Eye Tracking A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text
US8814357B2 (en) * 2008-08-15 2014-08-26 Imotions A/S System and method for identifying the existence and position of text in visual media content and for determining a subject's interactions with the text
US20120078065A1 (en) * 2009-03-06 2012-03-29 Imotions - Emotion Technology A/S System and method for determining emotional response to olfactory stimuli
US8750576B2 (en) * 2012-04-24 2014-06-10 Taiwan Colour And Imaging Technology Corporation Method of managing visiting guests by face recognition

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9386395B1 (en) 2010-09-06 2016-07-05 Sprint Communications Company L.P. Dynamic loading, unloading, and caching of alternate complete interfaces
US8954041B1 (en) 2011-02-08 2015-02-10 Sprint Communications Company L.P. System and method for ID platform
US9123062B1 (en) 2011-02-18 2015-09-01 Sprint Communications Company L.P. Ad sponsored interface pack
US9043446B1 (en) * 2011-03-10 2015-05-26 Sprint Communications Company L.P. Mirroring device interface components for content sharing
US8972592B1 (en) 2011-05-27 2015-03-03 Sprint Communications Company L.P. Extending an interface pack to a computer system
US9619810B1 (en) 2011-10-11 2017-04-11 Sprint Communications Company L.P. Zone architecture for dynamic targeted content creation
US9189607B1 (en) 2012-06-29 2015-11-17 Sprint Communications Company L.P. Mobile phone controls preprocessor
US9413839B2 (en) 2012-07-31 2016-08-09 Sprint Communications Company L.P. Traffic management of third party applications
US9183412B2 (en) 2012-08-10 2015-11-10 Sprint Communications Company L.P. Systems and methods for provisioning and using multiple trusted security zones on an electronic device
US9811672B2 (en) 2012-08-10 2017-11-07 Sprint Communications Company L.P. Systems and methods for provisioning and using multiple trusted security zones on an electronic device
US9442709B1 (en) 2012-10-24 2016-09-13 Sprint Communications Company L.P. Transition experience during loading and updating an interface and applications pack
US9513888B1 (en) 2014-01-30 2016-12-06 Sprint Communications Company L.P. Virtual preloads
US10026328B2 (en) * 2014-10-21 2018-07-17 i-Human Patients, Inc. Dynamic differential diagnosis training and evaluation system and method for patient condition determination
US9483253B1 (en) 2015-04-30 2016-11-01 Sprint Communications Company L.P. Methods for customization of default applications on a mobile communication device

Also Published As

Publication number Publication date
CA2657176C (en) 2015-09-08
AU2007272422A1 (en) 2008-01-17
US20080020361A1 (en) 2008-01-24
US8469713B2 (en) 2013-06-25
KR20090043513A (en) 2009-05-06
WO2008008893A3 (en) 2008-12-11
US20140370468A1 (en) 2014-12-18
JP2009543611A (en) 2009-12-10
CA2657176A1 (en) 2008-01-17
EP2050086A2 (en) 2009-04-22
WO2008008893A2 (en) 2008-01-17
US20150287330A1 (en) 2015-10-08
WO2008008893A4 (en) 2009-01-29
CN101506859A (en) 2009-08-12
MX2009000206A (en) 2009-06-08

Similar Documents

Publication Publication Date Title
Cooke et al. Educating physicians: a call for reform of medical school and residency
Bosher et al. From needs analysis to curriculum development: Designing a course in health-care communication for immigrant students in the USA
Collins et al. The diffusion of effective behavioral interventions project: development, implementation, and lessons learned
Harden Ten questions to ask when planning a course or curriculum
Back et al. Efficacy of communication skills training for giving bad news and discussing transitions to palliative care
O'Connor Clinical instruction & evaluation: A teaching resource
Corey Theory and practice of counseling and psychotherapy
Beard et al. A survey of health-related activities on second life
Maynard et al. Conversation analysis, doctor–patient interaction and medical communication
Cormier et al. Interviewing and Change Strategies for Helpers: Fundamental Skills and Cognitive Behavioral Interventions, 6th ed.
Benbassat et al. What is empathy, and how can it be promoted during clinical clerkships?
Kim et al. A conceptual framework for developing teaching cases: a review and synthesis of the literature across disciplines
Spiegelman Integrating Doctrine, Theory and Practice in the Law School Curriculum: The Logic of Jake's Ladder in the Context of Amy's Web
Clark Creating & sustaining civility in nursing education
Emerson Nursing education in the clinical setting
Dearmon et al. Effectiveness of simulation-based orientation of baccalaureate nursing students preparing for their first clinical experience
Rees et al. Narrative, emotion and action: analysing 'most memorable' professionalism dilemmas
US20160155352A1 (en) Virtual counseling practice
Rees et al. “User involvement is a sine qua non, almost, in medical education”: learning with rather than just about health and social care service users
Pill et al. Can nurses learn to let go? Issues arising from an intervention designed to improve patients’ involvement in their own care
Howard A comparison of educational strategies for the acquisition of medical-surgical nursing knowledge and critical thinking skills: Human patient simulator vs. the interactive case study approach
Chan Interpretive phenomenology in health care research
CA2657176C (en) Computerized medical training system
Wear et al. Educating for professionalism: Creating a culture of humanism in medical education
Malau-Aduli Exploring the experiences and coping strategies of international medical students

Legal Events

Date Code Title Description
AS Assignment

Owner name: MEDICAL CYBERWORLDS, INC., WISCONSIN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KRON, FREDERICK W., DR.;FALSTEIN, NOAH;MARSELLA, STACY;SIGNING DATES FROM 20070724 TO 20070814;REEL/FRAME:030923/0715

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION