WO1995006296A1 - Automated method and system for obtaining confidential patient information - Google Patents

Automated method and system for obtaining confidential patient information

Info

Publication number
WO1995006296A1
Authority
WO
WIPO (PCT)
Prior art keywords
respondent
response
privacy
question
receiving
Application number
PCT/US1994/009417
Other languages
English (en)
Inventor
Paul D. Cumming
Ronald S. Karpf
Original Assignee
Talisman, Ltd.
Application filed by Talisman, Ltd. filed Critical Talisman, Ltd.
Priority to AU76350/94A
Publication of WO1995006296A1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H10/20: ICT specially adapted for the handling or processing of patient-related medical or healthcare data, for electronic clinical trials or questionnaires
    • G16H10/60: ICT specially adapted for the handling or processing of patient-related medical or healthcare data, for patient-specific data, e.g. for electronic patient records
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • This invention relates to a system for eliciting confidential information, and more specifically to a system for eliciting confidential medical histories and information from human beings.
  • Units of whole blood, red cells, platelets, and plasma (collectively "blood components"), necessary to sustain life in surgical procedures and used to manufacture lifesaving pharmaceuticals, can pose a …
  • Transcription errors may occur when responses are recorded by a human interviewer, and key entry errors may occur when responses are entered into a computer database. Additionally, interviewer bias may affect the responses, and the interviewer may fail to ask all requisite questions. These errors and omissions can lead to citations from the U.S. Food and Drug Administration (FDA).
  • The invention controls the manner in which the questions are posed and/or answered, assuring that all questions are addressed and responses recorded correctly.
  • The questions may be asked so that they are audible only to the respondent and are not visible to other persons who may be present.
  • The respondent may, at his option, give his responses in a way that does not make them visible to other persons during the session.
  • the respondent also controls the volume of any audible output and may choose to suppress an audible echo of his responses. Allowing the respondent to select various options to ensure the confidentiality of his responses results in the respondent giving more accurate, complete, and truthful information.
  • the invention facilitates correct responses by improved communication because each question is displayed on a display screen, asked via audio, and illustrated with color pictures and graphics. The questions are presented consistently each time, thus controlling interviewer bias. The invention also eliminates transcription and key entry errors and the questions are stored in a manner that makes them easy to update.
  • The present invention overcomes the problems and disadvantages of the prior art by allowing the respondent to choose the language in which the questions are presented.
  • the invention also waits a predetermined time before accepting input from the respondent, to ensure that the respondent has had time to read or hear the question. Lastly, the invention keeps track of indirect information about responses, such as consistency with prior answers, whether the respondent answered then changed their response, and a length of time the respondent spent considering the question.
  • The invention is a system for gathering confidential data from a human respondent, comprising: a display device; means for presenting the human respondent with two privacy alternatives …
  • the invention is a system for gathering confidential data from a human respondent, comprising: an audio output device; means for allowing the human respondent to control the volume of the audio output device; means for presenting the human respondent with two privacy alternatives that relate to a manner in which a response should be solicited; means for receiving an input from the human respondent choosing one of the privacy alternatives; means for soliciting, via the output device in accordance with the chosen privacy alternative, the response; and means for receiving the response from the respondent.
  • The invention is a system for gathering confidential data from a human respondent, comprising: a first input device; a second input device; means for displaying two privacy alternatives; means for receiving an input from the human respondent choosing one of the privacy alternatives; means for soliciting a response from the human respondent; and means for receiving a response from the human respondent via one of the first and second input devices in accordance with the chosen privacy alternative.
  • the invention is a system for gathering confidential data from a human respondent, comprising: an audio output device; means for receiving an input from the human respondent indicating a desired degree of privacy during questioning; means for soliciting the human respondent for a response; means for receiving a response from the human respondent; and means for echoing the response via the audio output device when the human respondent indicates a predetermined degree of privacy.
  • The invention is a system for gathering confidential data from a human respondent, comprising: an audio output device having a handset and a loudspeaker; means for allowing the respondent to indicate a privacy alternative relating to whether a response is solicited via the handset or the loudspeaker; means for soliciting, via the audio output device in accordance with the indicated privacy alternative, the response; and means for receiving a response from the human respondent.
  • Fig. 1 is a block diagram of a preferred embodiment of the present invention;
  • Fig. 2 shows an example format of a question screen displayed on a touchscreen of Fig. 1;
  • Fig. 3 shows an example of a question screen having the format of Fig. 2;
  • Fig. 4 is a picture of a touchpad of Fig. 1 with a template laid over it;
  • Fig. 5 shows an example of a display screen on the touchscreen of Fig. 1;
  • Fig. 6 shows an example of a display screen on the touchscreen of Fig. 1;
  • Fig. 7 shows an example of a display screen on the touchscreen of Fig. 1;
  • Fig. 8 shows an example of a display screen on the touchscreen of Fig. 1;
  • Fig. 9 is a flowchart of the steps performed by a processor of Fig. 1;
  • Fig. 10 is a flowchart of the steps performed by the processor of Fig. 1;
  • Fig. 11 shows a format of data stored after a set of questions has been asked;
  • Fig. 12(a) shows a format of a data structure used to implement a state machine indicating an order of questions and a screen format for each question; and
  • Fig. 12(b) shows a table of the data structure of Fig. 12(a).
  • the present invention allows respondents to answer questions based on their own sense of privacy in a language of their choice.
  • The questions are customized to the sex of the respondent and other factors, such as whether the respondent is a new or repeat donor.
  • the respondent can tune the system to achieve his sense of privacy through volume controls, source of the audio (loudspeaker or handsets), whether to have verbal feedback of a response, and controlling the visibility of text and graphics on the screen that could provide any hint as to the nature of the questions to other persons present.
  • the respondent also controls the input source, using, e.g., a touchscreen or a more confidential touchpad device.
  • The audio of each question is accompanied by a display of the full text of the question and by color graphics.
  • response buttons are not activated until after a predetermined period of time.
  • the sequence and content of questions is controlled by entries in an input setup file that is easily edited to change the text, color graphics, or question order.
  • Responses are collected and used as input to a general respondent qualification decision rule.
  • the decision rule is stored in a file or table and may be easily changed.
  • the responses or decision result may be output to a printer or CRT for aiding human decision making.
  • a blood donor is defined as any person who donates, either freely or for compensation, whole blood or any whole blood component, e.g., red cells, platelets or plasma.
  • Fig. 1 is a block diagram of a preferred embodiment of the present invention.
  • a system 100 of Fig. 1 is used to elicit confidential information from potential blood donors, but could also be used to elicit confidential information of other types, such as information relating to medical histories, employment or welfare eligibility.
  • System 100 of Fig. 1 includes a monitor 101 having a touch screen 102 and a speaker unit 103.
  • Speaker unit 103 includes a thumbnail switch volume control 105, a handset 104, and a loudspeaker 107.
  • System 100 also includes a touchpad 106 connected to computer 110.
  • Computer 110 includes a memory 112 and a processor (CPU) 114.
  • Monitor 101 is connected to computer 110 in a manner known to persons of ordinary skill in the art. Other embodiments may use, e.g., a mouse, keyboard, stylus, bar code or magstripe reader, or a Braille touchpad as an input device instead of touchscreen 102, and may use, e.g., a headset or earphones as an audio output device instead of handset 104.
  • a question is always output verbally through speaker unit 103 using one of handset 104 and loudspeaker 107.
  • Speaker unit 103 is connected to computer 110.
  • The question is output over loudspeaker 107 when handset 104 is in its cradle. When handset 104 is picked up, the question is output over handset 104 and not over loudspeaker 107. Thus, if the respondent does not want the questions to be audible to other persons, he picks up handset 104, and the questions cannot be heard by others.
  • the determination of whether handset 104 is in its cradle preferably is made by circuitry internal to speaker unit 103.
  • the respondent can adjust the volume of both loudspeaker 107 and handset 104 to a level that the respondent feels is both comfortable and confidential using volume control 105.
  • Computer 110 sends signals to speaker unit 103 to control audio output and may receive status signals from speaker unit 103.
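The handset/loudspeaker routing described above reduces to a simple rule keyed on the cradle switch. The sketch below is illustrative; the function name and boolean argument are not from the patent:

```python
def audio_target(handset_off_cradle: bool) -> str:
    """Route the spoken question per the cradle switch in speaker unit 103:
    over the handset when it is lifted, otherwise over the loudspeaker."""
    return "handset" if handset_off_cradle else "loudspeaker"

# Handset in its cradle: questions are audible to anyone present.
audio_target(False)  # "loudspeaker"
# Handset picked up: questions are private to the respondent.
audio_target(True)   # "handset"
```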
  • Computer 110 preferably is a Compudyne 486 PC manufactured by Compudyne Corp. of Addison, Texas.
  • Touchscreen 102 which attaches to the front of a standard computer monitor, preferably is a TouchWindow,
  • Touchpad 106 preferably is an UnMouse manufactured by MicroTouch, Inc. of Wilmington, Massachusetts.
  • Speaker unit 103 preferably is a Sound XChange unit, manufactured by InterActive, Inc. of Humboldt, South Dakota.
  • Fig. 2 shows an example format of a question screen 200 displayed on touchscreen 102.
  • Screen 200 includes a number of windows, or fields, generated using Microsoft Windows.
  • Field 202 preferably shows a still photograph of a nurse.
  • Field 202 is included to "humanize" the display and may be omitted in other embodiments.
  • Field 204 preferably shows the text of a question.
  • Field 204 may also be blank, as described below.
  • Field 206 preferably shows a color graphic associated with the question. Field 206 may also be blank, as described below.
  • screen 200 includes a title field 207, which contains a title of a type of screen 200 (e.g., "Medical History”); a label field 208, which contains a label of the question (e.g., "Question 1"); and an answer field 209, in which the respondent's current answer is echoed.
  • the respondent may change his answer at any time that the screen is displayed, and may go back to any screen to change his answer.
  • Screen 200 includes a plurality of buttons 210.
  • the "Back” button indicates that the respondent wants to display a previous question.
  • the "Yes” button, the “No” button, and the “Don't know” button each represent a possible response to a question.
  • the "Next” button indicates that the respondent wishes to go to the next question. In the described embodiment, it is possible to skip a question by touching the "Next" button.
  • Other embodiments may not allow questions to be skipped, may have different buttons such as VCR-like buttons to control motion video or mutually exclusive year buttons to indicate when the respondent traveled outside the U.S., or may have different arrangements of the described buttons.
  • Fig. 3 shows an example of a question screen 300 displayed on touchscreen 102 and having the format of screen 200 of Fig. 2.
  • Field 307 preferably shows the title of the screen ("Medical History").
  • Field 302 preferably shows a still color photograph of a nurse.
  • Field 304 preferably shows the text of a question (e.g., "Have you read and do you understand the required …").
  • Fig. 4 is a picture of touchpad 106 with a template laid over it.
  • the template has printed buttons 410 similar to buttons 210 of Fig. 2.
  • Various sections of touchpad 106 correspond to buttons 410, and touching those sections of touchpad 106 (when touchpad 106 is enabled) has the same effect as touching buttons 210 on touchscreen 102.
  • Other embodiments may include different or additional buttons on touchpad 106, e.g., buttons similar to those discussed below.
  • FIG. 5 shows an example of a display screen 500 on touchscreen 102.
  • Display screen 500 has four fields: a "language" field 502, a text/graphics display field 504, a response echo field 506, and an input device field 508.
  • Screen 500 also has a “continue” button 512, which indicates that the respondent wants to go to a next screen.
  • Each field 502-508 allows the respondent to select one of a plurality of alternatives for a privacy option. The respondent selects an alternative by touching the screen, or by using a mouse (not shown), touchpad 106, or some similar pointing/selection device.
  • Language field 502 allows the respondent to select the alternative of having questions displayed and/or spoken in English or Spanish.
  • The question displayed in field 202 of Fig. 2 appears in either English or Spanish, and the question audio output to speaker unit 103 is in the same language.
  • Other embodiments of the invention may allow the respondent to select among additional languages. Field 502 preferably defaults to "English."
  • Fields 504-508 allow the user to select other alternatives that control the degree of privacy for the session.
  • Field 504 allows the respondent to select whether text and graphics associated with each question are displayed on touchscreen 102. If the respondent does not wish potentially embarrassing text or graphics to be displayed, he selects "not shown.” Field 504 preferably defaults to "visible.”
  • Field 506 allows the respondent to select whether his responses are echoed (in field 209 of Fig. 2 and through speaker unit 103) or are not echoed. If the respondent does not wish potentially embarrassing responses to be echoed, he selects "off" and his responses are not echoed either audibly or on touchscreen 102. Field 506 preferably defaults to …
  • Field 508 allows the respondent to select either touchscreen 102 or touchpad 106 as the input device when responding to confidential questions. If the respondent does not want other persons to see him making responses (as would happen with touchscreen 102), he selects "private touchpad."
  • Field 508 preferably defaults to "touchscreen.”
  • The respondent can toggle the "visible"/"not shown" option. This option is toggled by touching any part of the text or graphic display on touchscreen 102. For example, if the text and graphics are displayed, touching them causes them to be hidden.
  • The respondent can toggle the "touchscreen"/"private touchpad" option. This option is toggled by touching the desired input device. For example, if the respondent is using buttons 210 to indicate his responses, touching touchpad 106 will cause the system to accept input from touchpad 106 and not display buttons 210 on the screen. Touching touchscreen 102 has the reverse effect.
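The session options of Fig. 5, their stated defaults, and the toggling behavior can be modeled as a small state object. This is an illustrative sketch, not code from the patent; the default for the echo option (field 506) is an assumption, since the text does not state it.

```python
from dataclasses import dataclass

@dataclass
class PrivacyOptions:
    language: str = "English"          # field 502: "English" or "Spanish"
    display: str = "visible"           # field 504: "visible" or "not shown"
    echo: str = "on"                   # field 506: "on" or "off" (default assumed)
    input_device: str = "touchscreen"  # field 508: "touchscreen" or "private touchpad"

    def toggle_display(self) -> None:
        # Toggled by touching any part of the text or graphic display.
        self.display = "not shown" if self.display == "visible" else "visible"

    def select_input(self, device: str) -> None:
        # Toggled by touching the desired input device.
        self.input_device = device

opts = PrivacyOptions()                # all defaults
opts.toggle_display()                  # respondent hides text and graphics
opts.select_input("private touchpad")  # respondent switches to the touchpad
```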
  • FIG. 6 shows an example of a display screen 600 on touchscreen 102.
  • Display screen 600 has two fields: a "sex" field 602, and a "donation status" field 604.
  • Each field 602 and 604 allows the respondent to select one of a plurality of options.
  • Field 602 allows the respondent to indicate his or her sex.
  • Field 602 preferably defaults to "male.”
  • Field 604 allows the respondent to indicate whether he is a new donor or a repeat donor.
  • Field 604 preferably defaults to "new donor.”
  • When the respondent has selected the "visible" option, a screen such as that of Fig. 3 is displayed.
  • When the respondent has selected the "not shown" option, a screen such as that of Fig. 7 is displayed for each question.
  • Fig. 7 is similar to Fig. 2, except that field 706 is blank and field 704 contains the words "Touch Here to View the Question.”
  • When the respondent has selected the "touchscreen" option, a screen such as that of Fig. 2 is displayed.
  • When the respondent has selected the "private touchpad" option, a screen such as that of Fig. 8 is displayed for each question.
  • Fig. 8 is similar to Fig. 3, except that buttons 210 are not displayed. Since the user will be indicating responses on touchpad 106, there is no purpose in displaying buttons on touchscreen 102. Since the respondent desires a high degree of privacy, the text and color graphics are not displayed on touchscreen 102.
  • Fig. 9 is a flowchart of the steps performed by processor 114 of Fig. 1 executing a program stored in memory 112 during the operation of the described embodiment for blood donor screening.
  • Processor 114 displays the screen of Fig. 5 on touchscreen 102.
  • the respondent may select different alternatives for the various options until he finalizes his options by touching "continue" button 512, which is verbally echoed.
  • the respondent's choices are stored by processor 114 in memory 112 or in some other storage medium, such as a hard disk (not shown).
  • The term "storage medium" is also intended to include, e.g., an optical storage device and/or a CD-ROM storage device.
  • processor 114 displays the screen of Fig. 6 on touchscreen 102 and allows the respondent to indicate his or her sex and whether he is a repeat donor.
  • the respondent finalizes his entry by touching "continue" button 612, which is verbally echoed.
  • Other embodiments may request that the respondent enter a social security number, use a bar code or magstripe identification card, or have their picture or signature recorded (not shown).
  • processor 114 displays screens having a format of Fig. 2 on touchscreen 102 (or Fig. 7 or Fig. 8, depending on the privacy options entered) and allows the respondent to enter his response to questions 1 through n.
  • The steps performed to ask a question and receive a response are described in connection with Fig. 10.
  • Fig. 10 shows a path between step 908 and 906 to indicate that a previous question is asked when the "Back" button is touched.
  • processor 114 applies a decision rule to decide if the respondent will be allowed to donate blood.
  • Other embodiments may output the information for human decision making via printer or monitor instead of, or in addition to, the decision rule.
  • This decision rule is discussed below. If the result of the decision rule is "yes,” the respondent is allowed to donate blood and a message to that effect is displayed on touchscreen 102. If the result of the decision rule is "no,” the respondent is deferred and is not allowed to donate blood. A message to that effect is displayed on touchscreen 102. These messages are displayed whether or not the "not shown” option is turned on.
  • processor 114 saves the donor's status in memory 112 or in some other storage device, such as a hard disk, along with a social security number or other unique ID, and/or outputs results to a printer or another CRT.
  • an answer of "yes” to certain questions will cause the respondent to be deferred as a donor.
  • Other embodiments may use more sophisticated decision rules. For example, the following decision rule applies to a system where some questions should be answered “Yes” and some should be answered “No” and the system stores previous responses from previous sessions of each donor in a database.
  • The current set of questions and answers is Session n. The previous set of questions and answers is Session n−1.
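The patent does not reproduce the full rule, but a decision rule of the kind described, where some questions must be answered "Yes" and some "No", and responses from Session n are compared with Session n−1 from the database, might be sketched like this. The expected answers and the qualification logic below are illustrative assumptions:

```python
# Hypothetical expected answers: question number -> qualifying response.
EXPECTED = {1: "Yes", 2: "No", 3: "No"}

def decide(session_n, session_prev=None):
    """Return (qualified, changed): whether every response matches its
    expected answer, and which answers differ from the previous session."""
    qualified = all(session_n.get(q) == ans for q, ans in EXPECTED.items())
    changed = []
    if session_prev is not None:
        # Flag answers that changed since Session n-1 for human review.
        changed = [q for q in EXPECTED if session_n.get(q) != session_prev.get(q)]
    return qualified, changed

# A respondent whose answer to question 2 changed since the last session:
decide({1: "Yes", 2: "No", 3: "No"}, {1: "Yes", 2: "Yes", 3: "No"})
```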
  • the database may be stored in a memory of a computer at a central location, and processor 114 accesses the central computer's database via modem or a network. In other embodiments, the database may be stored in memory 112 or in a storage device of
  • processor 114 solicits and receives from the respondent a time since respondent's last donation.
  • The system also solicits and receives a type of donation occurring (or a type of donation is predefined). If the time since the last donation is less than the minimum acceptable time for this type of donation, the respondent is deferred.
  • a type of donation occurring determines the minimum acceptable period of time between donations. For example, homologous whole blood donors may donate approximately every eight weeks. Autologous donors may donate as often as once a week. Platelet donors may donate more
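The inter-donation interval check described above amounts to a lookup of the minimum acceptable time by donation type. A sketch using the intervals stated in the text (the platelet interval is omitted because the text gives no figure, and the function name is illustrative):

```python
# Minimum acceptable weeks between donations, per donation type.
MIN_WEEKS = {
    "homologous whole blood": 8,  # approximately every eight weeks
    "autologous": 1,              # as often as once a week
}

def may_donate(donation_type: str, weeks_since_last: float) -> bool:
    """Defer the donor when too little time has passed for this type."""
    return weeks_since_last >= MIN_WEEKS[donation_type]

may_donate("homologous whole blood", 6)  # deferred: False
may_donate("autologous", 2)              # allowed: True
```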
  • processor 114 may access a database to determine a time since last donation instead of eliciting this information from the respondent.
  • the result of the decision rule preferably is saved in memory 112.
  • The result may also be saved on disk, saved in a central database, presented to a medical professional via CRT for additional review, or output to a printer as a permanent legal record or for other purposes.
  • Certain embodiments, such as those regulated by the FDA, will always print out or display a summary of all decisions and the data used to reach the decision, so that a human operator can verify the decision.
  • Such a system may allow the reviewer to store notes on why a donor was or was not allowed to donate and to store a signature of the reviewer and/or donor.
  • Other embodiments may not have provisions for human oversight.
  • The following paragraphs describe the steps performed by processor 114 to ask one question. All of the steps of Fig. 10 are performed for each question.
  • In a first step, processor 114 initializes variables representing an initial start time and an initial question number. The initial start time is also saved in a memory or other storage device.
  • In step 1004, if the current question should be skipped because of the donor's sex, processor 114 sets a button (BTN) variable to "skip" in step 1006 and control passes to step 1022.
  • If, in step 1004, the question should not be skipped, control passes to step 1008.
  • processor 114 determines whether the respondent wants to display text and graphics associated with the question.
  • The respondent previously indicated his preference using field 504 of Fig. 5 (or by toggling the "display" option as discussed above). As discussed above, the respondent can toggle the "display" option at any time; therefore, processor 114 frequently checks to see if a toggle has occurred. This check has not been included in the flowcharts for ease of explanation. If text and graphics are to be displayed, processor 114 executes a "show" function in step 1012 that displays the text and color graphic for the current question on touchscreen 102 (see Fig. 3) and outputs the audio for the question to speaker unit 103. Processor 114 also sets an audiotimer variable to "short". The audiotimer variable is used in steps 1014-1016 to determine when to start accepting input from touchscreen 102 or touchpad 106. In this embodiment, when text and color graphics are displayed, a relatively short time passes before processor 114 begins accepting input.
  • repeat donors are always given a "short" response time, irrespective of whether text and graphics are to be displayed.
  • processor 114 executes a "hide" function in step 1010 that does not display the text and color graphic for the current question (see Fig. 7) and that outputs the audio for the question to speaker unit 103.
  • Processor 114 also sets an audiotimer variable to "long". In the embodiment, when text and graphics are not displayed, it is assumed that the respondent must wait to hear the question spoken over speaker unit 103. Thus, a relatively long time passes before processor 114 begins accepting input.
  • processor 114 sets a software audio timer to the value of the audiotimer variable.
  • Processor 114 displays buttons on touchscreen 102 (if the "touchscreen" option has been selected) (step 1017). No input is accepted from the selected input device until the software audio timer has expired.
  • response buttons 210 are displayed and become active after three seconds of audio. If text and graphics are not visible, then the response buttons are not activated until the end of the audible question.
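The audiotimer behavior in steps 1008-1017 can be summarized as a small function: buttons activate after three seconds of audio when text and graphics are visible (or for a repeat donor), and only at the end of the audible question otherwise. The function name and the use of the audio clip length are illustrative assumptions:

```python
SHORT_DELAY_SECONDS = 3.0  # "short": three seconds of audio

def button_activation_delay(text_visible: bool, repeat_donor: bool,
                            audio_length_seconds: float) -> float:
    """Seconds before response buttons become active for a question."""
    if text_visible or repeat_donor:
        return SHORT_DELAY_SECONDS   # audiotimer = "short"
    return audio_length_seconds      # audiotimer = "long": wait out the audio

button_activation_delay(True, False, 12.0)   # 3.0
button_activation_delay(False, False, 12.0)  # 12.0
button_activation_delay(False, True, 12.0)   # 3.0 (repeat donors always "short")
```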
  • Processor 114 waits for the respondent to touch a button and, in step 1020, sets the BTN variable to a value representing the button touched by the respondent. In step 1022, processor 114 saves the end time in memory 112 or in another storage device. Because the start and end times for the question are both saved, it is possible to determine how long it took the respondent to answer the question.
  • Processor 114 then saves the question number (q), the respondent's response (BTN) (i.e., the button touched by the respondent), and the amount of time (t) in memory 112 or a storage device.
  • Processor 114 saves multiple responses if the respondent repeats questions and gives more than one response. Other embodiments may save only the most recent response.
  • Fig. 11 shows a format of a data structure used by processor 114 to store user responses for a set of questions. Information for Session n includes a tag, such as a donor ID and a date, and the respondent's responses (q1 ... qt).
  • a donor ID can be, e.g., a social security number or a transaction ID number.
  • Each response (qn) is stored as a question number, the response (possible values include "Yes", "No", "Don't know", or unanswered), and the amount of time it took the respondent to answer the question.
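The session record of Fig. 11 can be sketched with the following data structures (class and field names are illustrative, not from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Response:
    question: int            # question number
    answer: Optional[str]    # "Yes", "No", "Don't know", or None if unanswered
    seconds: float           # time the respondent took to answer

@dataclass
class Session:
    donor_id: str            # e.g., social security or transaction ID number
    date: str
    responses: List[Response] = field(default_factory=list)

session = Session(donor_id="T-0001", date="1994-08-19")
session.responses.append(Response(question=1, answer="Yes", seconds=4.2))
```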
  • Processor 114 determines the flow of control in accordance with the button touched by the respondent (or whether the question was skipped because of the sex of the respondent). If the respondent touched the "Back" button, processor 114 displays the prior frame as determined from information in the "Back" field of the state variable described in Fig. 12(a) below. The result of this frame navigation may be to stay at the same question, if there are no previous questions.
  • Otherwise, processor 114 navigates to the next frame determined from information in the respective button fields of the state variable described in Fig. 12(a) below.
  • the result of frame navigation may be to stay at the same question, if the current question is the last question.
  • HBIG: Hepatitis B Immune Globulin.
  • These questions may be derived from the American Association of Blood Banks (proposed) standard set, which is derived from …
  • Fig. 12(a) shows a format of a data structure for each state in a state machine used by processor 114 to implement the flowcharts of Figs. 9 and 10.
  • the state machine is stored as a table (Fig. 12(b)) or a file in memory 112, where every table entry corresponds to a state S and has the format shown in Fig. 12(a).
  • the data structure includes a name of bitmap files for the graphic (field 206) and the picture of the nurse (field 202).
  • The data structure also includes names of files having, respectively, the text of the question (field 204), the label (field 208), and the title (field 207).
  • the data structure also includes a name 1202 of a file having the audio that is to be sent to speaker unit 103 by processor 114.
  • the data structure has a plurality of "state numbers" 1204 to which control passes depending on which button was touched by the respondent ("Yes”, “No”, “Don't Know", “Back”, “Next”). (Skipped questions cause control to pass to the "Next” screen.)
  • the data structure has an indicator variable for whether a question applies only to females (F), males (M) or to both sexes (B).
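Putting the Fig. 12(a) fields together, each table entry might look like the record below, with frame navigation reduced to a lookup of the touched button in the entry's state numbers. The file names and the two-state table are placeholders, not data from the patent:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class State:
    graphic_bmp: str            # bitmap for the color graphic (field 206)
    nurse_bmp: str              # bitmap for the nurse picture (field 202)
    text_file: str              # question text (field 204)
    label_file: str             # label (field 208)
    title_file: str             # title (field 207)
    audio_file: str             # audio sent to speaker unit 103
    next_state: Dict[str, int]  # button name -> state number
    sex: str                    # "F", "M", or "B" (both)

TABLE = {
    1: State("q1.bmp", "nurse.bmp", "q1.txt", "lbl1.txt", "title.txt", "q1.wav",
             {"Yes": 2, "No": 2, "Don't Know": 2, "Back": 1, "Next": 2}, "B"),
    2: State("q2.bmp", "nurse.bmp", "q2.txt", "lbl2.txt", "title.txt", "q2.wav",
             {"Yes": 2, "No": 2, "Don't Know": 2, "Back": 1, "Next": 2}, "B"),
}

def navigate(state_num: int, button: str) -> int:
    """Look up the next state number for the touched button.
    Skipped questions pass control to the "Next" screen."""
    if button == "skip":
        button = "Next"
    return TABLE[state_num].next_state[button]

navigate(1, "Yes")   # 2
navigate(2, "Back")  # 1
navigate(1, "skip")  # 2
```

Because the table is data rather than code, reordering or replacing questions only requires editing the table entries, which matches the text's claim that the question set is easy to update.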
  • The blood donor system described above can be used to ask other types of confidential questions, such as questions to screen plasma donors, questions relating to a medical history or employment experience, or questions related to the respondent's eligibility for public assistance. All that is necessary to adapt the system for a different set of questions is to change the text files containing the questions or change the graphics associated with the questions.
  • In other embodiments, the order of the questions may change depending on the responses entered. To implement this feature, the values of the state machine table of Fig. 12(b) would have to be changed. The format used by the state machine would not have to change, but could change depending on the application.
  • processor 114 may display and output to the speaker a completely different line of questions.
  • Fig. 3 could be replaced with a moving picture of a nurse.
  • full motion video preferably would be achieved through use of Microsoft Video for Windows, where the data required for the moving image is stored in a file or series of files as described in the Microsoft Video for Windows User's Guide available from Microsoft Corporation, which is herein incorporated by reference.
  • the picture in field 306 of Fig. 3 could be a full motion video picture or a third window containing full motion video could be added to a screen.
  • a video camera is added to the system of Fig. 1 (or a similar system).
  • Processor 114 displays a screen (not shown) that asks the respondent to step into a predetermined area and processor 114 activates the video camera to record a video record of the respondent in memory 112 or some other storage medium.
  • Processor 114 reads the respondent's signature from touchscreen 102 and stores a graphic representation of the signature in a storage device. Such an embodiment could also use a light pen as an input device.
  • Another embodiment solicits identification data from the respondent.
  • This data can be a social security number or ID number entered by the respondent via touchscreen 102, touchpad 106, or a keyboard, or by inserting a magnetic card in a magnetic card reader.
  • the system may assign an arbitrary ID to the respondent and associate the ID with the respondent's name stored in a separate file.
  • processor 114 displays a screen showing a medical "informed consent" form.
  • the respondent cannot skip the screen, but must touch a button on the screen (or some alternate input device) indicating that he has read and understood the consent form and that he has given his consent for the medical procedure mentioned in the form.
  • the consent form may be too long to fit in the window, and processor 114 scrolls the text of the consent form within the window in response to buttons touched by the respondent.
  • the user must scroll through the entire form before being allowed to enter a response because it is important to be able to draw an inference that the respondent has read the consent form.
  • processor 114 displays another screen (similar to that of Fig. 6) that allows the respondent to indicate whether this is the first time he has given responses to the questions (e.g., if he is a first-time blood donor). If the respondent has never answered the questions before, processor 114 displays a series of complex questions and requires the respondent to wait several seconds before answering. If the respondent has seen the questions before, processor 114 displays a series of shorter questions and allows the respondent to answer after a shorter period of time.
  • This embodiment requires that the state table of Figs. 12(a) and 12(b) indicate two transition states for each button: one transition state for new donors and one for repeat donors.
  • buttons 210 include a "Pause/Play" button that allows the respondent to pause and restart (play) the questions. Some embodiments have background music or some type of background audio playing while waiting for a response from the respondent.
  • the Pause button allows the respondent to pause the background music.
  • a keyboard as an input device.
  • a mouse or stylus that moves a cursor on a screen as an input device.
  • a bar code or magstripe reader as an input device.
  • some embodiments store, retrieve from, and check a networked central database of the names of respondents that have been deferred or that have been identified as troublesome or litigious patients. These embodiments request that the respondent enter their name or their ID code. If processor 114 finds the name or ID code in the central database, processor 114 identifies the respondent as a potentially troublesome respondent. For example, the database could keep track of whether
  • processor 114 sends the responses to a central computer system over a modem, where the responses are stored in a central database.
  • the central computer stores a database of troublesome respondents. If the central computer determines that the respondent is a potentially troublesome respondent, a warning signal is sent back to processor 114, and processor 114 prints or displays a warning message.
  • the present invention solicits responses to highly confidential questions.
  • the invention also eliminates transcription and key entry errors and interviewer bias, and makes the system easy to update and change.
  • the questions are presented in a consistent manner, and the system can ensure that all questions are answered.
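The state-table mechanism described above (the "state numbers" of Fig. 12(b), the F/M/B sex indicator, and skipping of inapplicable questions via the "Next" transition) can be sketched roughly as follows. This is an illustrative reconstruction, not the patent's actual table: the state numbers, question texts, and field names are hypothetical, and the real table would be loaded from the easily modified input file the patent describes.

```python
# Hypothetical state-machine question table in the style of Figs. 12(a)-12(b).
# Each state holds the question text, a sex-applicability flag ("F", "M", "B"),
# and the state control passes to for each button the respondent may touch.
QUESTIONS = {
    1: {"text": "Do you feel well today?", "sex": "B",
        "next": {"Yes": 2, "No": 99, "Don't Know": 99, "Back": 1, "Next": 2}},
    2: {"text": "Have you been pregnant in the last six weeks?", "sex": "F",
        "next": {"Yes": 99, "No": 3, "Don't Know": 99, "Back": 1, "Next": 3}},
    3: {"text": "Have you ever had hepatitis?", "sex": "B",
        "next": {"Yes": 99, "No": 99, "Don't Know": 99, "Back": 2, "Next": 99}},
}
END_STATE = 99  # hypothetical terminal state ("interview complete")


def next_state(state, button, respondent_sex):
    """Advance the interview; questions flagged for the other sex are skipped
    by following their "Next" transitions, as the skipped-question rule above
    describes."""
    nxt = QUESTIONS[state]["next"][button]
    while nxt in QUESTIONS and QUESTIONS[nxt]["sex"] not in ("B", respondent_sex):
        nxt = QUESTIONS[nxt]["next"]["Next"]
    return nxt
```

For a male respondent, the female-only question in state 2 is skipped automatically: `next_state(1, "Yes", "M")` passes control directly to state 3. Supporting separate transitions for new and repeat donors, as described above, would only require a second `"next"` mapping per state.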
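The informed-consent behavior described above, where processor 114 scrolls the form within a window and no consent response is accepted until the entire form has been displayed, can be sketched as follows. The class and its names are assumptions made for illustration, not the patent's implementation.

```python
class ConsentForm:
    """Hypothetical sketch: gate the "I consent" button on the respondent
    having scrolled through the entire consent form."""

    def __init__(self, total_lines, window_lines):
        self.total_lines = total_lines    # length of the consent form text
        self.window_lines = window_lines  # lines visible in the window at once
        self.top = 0                      # index of the first visible line
        # A short form that fits entirely in the window is fully visible at once.
        self.seen_bottom = window_lines >= total_lines

    def scroll_down(self):
        """Advance the window one line in response to the scroll button."""
        self.top = min(self.top + 1, max(0, self.total_lines - self.window_lines))
        if self.top + self.window_lines >= self.total_lines:
            self.seen_bottom = True       # the end of the form has been displayed

    def consent_allowed(self):
        """Only enable the consent button once the whole form has been shown."""
        return self.seen_bottom
```

This supports the inference the text mentions: if `consent_allowed()` is true, every line of the form was displayed to the respondent at some point.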
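The networked check against a central database of deferred or troublesome respondents, described above, amounts to a lookup keyed on the respondent's name or ID code, with a warning returned to processor 114 on a match. A minimal sketch, in which the registry contents, ID format, and function names are all invented for illustration:

```python
# Hypothetical central registry mapping respondent ID codes to the reason
# they were flagged (deferred donor, litigious patient, etc.).
DEFERRED_REGISTRY = {
    "ID-1007": "permanently deferred donor",
    "ID-2231": "previously identified as litigious",
}


def check_respondent(respondent_id):
    """Return a warning message for processor 114 to print or display if the
    respondent appears in the central registry, or None if they do not."""
    reason = DEFERRED_REGISTRY.get(respondent_id)
    if reason is None:
        return None
    return f"WARNING: respondent {respondent_id} flagged: {reason}"
```

In the embodiment described above this lookup would run on the central computer, with the dictionary replaced by its central database and the returned warning sent back to processor 114 over the modem link.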

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

An automated computer system that uses audio and visual elements to ask highly confidential questions of individuals and to record their responses. The system allows a respondent to tune the system to his or her own sense of privacy: by choosing between audio devices (a speaker or a more private handset); by using volume control buttons; by viewing the questions and supporting pictorial elements or, more privately, hiding them; by opting in or out of verbal feedback on his or her responses; and by choosing the response buttons of a touchscreen or of a more private touchpad. The questions are available in several languages (English, Spanish, and others), and the question sequence is adapted to the respondent's sex and to whether the questions are being asked for the first time or have been asked before. The questions are reinforced by three groups of elements (audio, textual, and visual) to maximize comprehension. Respondents' access to the response buttons is delayed until they have heard or read enough to understand the question. The sequence and content of the questions are managed through an input file that can be easily modified. As the interview proceeds, each response, together with indirect data about each response, is recorded so that decision rules more complex than right/wrong answers alone can be applied.
PCT/US1994/009417 1993-08-27 1994-08-25 Automated method and system for eliciting confidential information from a patient WO1995006296A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU76350/94A AU7635094A (en) 1993-08-27 1994-08-25 Automated system and method for eliciting confidential information from a patient

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11245693A 1993-08-27 1993-08-27
US08/112,456 1993-08-27

Publications (1)

Publication Number Publication Date
WO1995006296A1 true WO1995006296A1 (fr) 1995-03-02

Family

ID=22344001

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1994/009417 WO1995006296A1 (fr) 1993-08-27 1994-08-25 Automated method and system for eliciting confidential information from a patient

Country Status (3)

Country Link
AU (1) AU7635094A (fr)
CA (1) CA2115878A1 (fr)
WO (1) WO1995006296A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998040835A1 (fr) * 1997-03-13 1998-09-17 First Opinion Corporation Systeme de gestion de maladies
EP0969761A2 (fr) * 1996-03-27 2000-01-12 Michael Hersh Application de la technologie multimedia aux outils d'evaluation psychologique et educative
US6206829B1 (en) 1996-07-12 2001-03-27 First Opinion Corporation Computerized medical diagnostic and treatment advice system including network access
US6270456B1 (en) 1993-12-29 2001-08-07 First Opinion Corporation Computerized medical diagnostic system utilizing list-based processing
US8121868B1 (en) 2004-09-10 2012-02-21 James Grady Systems and methods for providing an inducement to purchase incident to a physician's prescription of medication
USRE43433E1 (en) 1993-12-29 2012-05-29 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system
USRE43548E1 (en) 1993-12-29 2012-07-24 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system
US8781848B1 (en) 2004-09-10 2014-07-15 Ldm Group, Llc Systems and methods for providing an inducement of a purchase in conjunction with a prescription
US9081879B2 (en) 2004-10-22 2015-07-14 Clinical Decision Support, Llc Matrix interface for medical diagnostic and treatment advice system and method
CN107205092A (zh) * 2017-06-14 2017-09-26 捷开通讯(深圳)有限公司 存储设备、移动终端及其语音保密播放方法

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5201034A (en) * 1988-09-30 1993-04-06 Hitachi Ltd. Interactive intelligent interface
EP0583896A2 (fr) * 1992-07-28 1994-02-23 Fujitsu Limited Appareil automatique pour transactions de monnaie


Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7306560B2 (en) 1993-12-29 2007-12-11 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system including network access
US9005119B2 (en) 1993-12-29 2015-04-14 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system including network access
USRE43548E1 (en) 1993-12-29 2012-07-24 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system
US6270456B1 (en) 1993-12-29 2001-08-07 First Opinion Corporation Computerized medical diagnostic system utilizing list-based processing
USRE43433E1 (en) 1993-12-29 2012-05-29 Clinical Decision Support, Llc Computerized medical diagnostic and treatment advice system
US6641532B2 (en) 1993-12-29 2003-11-04 First Opinion Corporation Computerized medical diagnostic system utilizing list-based processing
EP0969761A2 (fr) * 1996-03-27 2000-01-12 Michael Hersh Application de la technologie multimedia aux outils d'evaluation psychologique et educative
EP0969761A4 (fr) * 1996-03-27 2000-01-12 Michael Hersh Application de la technologie multimedia aux outils d'evaluation psychologique et educative
US6491525B1 (en) 1996-03-27 2002-12-10 Techmicro, Inc. Application of multi-media technology to psychological and educational assessment tools
US7344496B2 (en) 1996-07-12 2008-03-18 Clinical Decision Support, Llc Computerized medical diagnostic system utilizing list-based processing
US6206829B1 (en) 1996-07-12 2001-03-27 First Opinion Corporation Computerized medical diagnostic and treatment advice system including network access
US6482156B2 (en) 1996-07-12 2002-11-19 First Opinion Corporation Computerized medical diagnostic and treatment advice system including network access
US7297108B2 (en) 1997-03-13 2007-11-20 Clinical Decision Support, Llc Disease management system and method including analysis of disease specific changes
US6770029B2 (en) 1997-03-13 2004-08-03 First Opinion Corporation Disease management system and method including correlation assessment
WO1998040835A1 (fr) * 1997-03-13 1998-09-17 First Opinion Corporation Systeme de gestion de maladies
US6234964B1 (en) 1997-03-13 2001-05-22 First Opinion Corporation Disease management system and method
US8533004B1 (en) 2004-09-10 2013-09-10 Ldm Group, Llc Systems and methods for patient communications in conjunction with prescription medications
US8615406B1 (en) 2004-09-10 2013-12-24 Ldm Group, Llc Systems and methods for content provision with a pharmacy transaction
US8781848B1 (en) 2004-09-10 2014-07-15 Ldm Group, Llc Systems and methods for providing an inducement of a purchase in conjunction with a prescription
US8781861B2 (en) 2004-09-10 2014-07-15 Ldm Group, Llc Systems and methods for providing an inducement to purchase incident to a physician's prescription of medication
US8121868B1 (en) 2004-09-10 2012-02-21 James Grady Systems and methods for providing an inducement to purchase incident to a physician's prescription of medication
US10311210B2 (en) 2004-09-10 2019-06-04 Ldm Group, Llc Systems and methods for providing an inducement of a purchase in conjunction with a prescription
US10984896B2 (en) 2004-09-10 2021-04-20 Ldm Group, Llc Systems and methods for providing an inducement to purchase incident to a physician's prescription of medication
US9081879B2 (en) 2004-10-22 2015-07-14 Clinical Decision Support, Llc Matrix interface for medical diagnostic and treatment advice system and method
CN107205092A (zh) * 2017-06-14 2017-09-26 捷开通讯(深圳)有限公司 存储设备、移动终端及其语音保密播放方法

Also Published As

Publication number Publication date
CA2115878A1 (fr) 1995-02-28
AU7635094A (en) 1995-03-21

Similar Documents

Publication Publication Date Title
Frisch et al. What’s in a definition? Holistic nursing, integrative health care, and integrative nursing: report of an integrated literature review
Hoffman Counseling the HIV-infected client: A psychosocial model for assessment and intervention
Tetzlaff Consumer informatics in chronic illness
US20040088317A1 (en) Methods, system, software and graphical user interface for presenting medical information
US20080319798A1 (en) Personalized medical information card and method for managing same
US20150261918A1 (en) System and method for medical services through mobile and wireless devices
JP2004514982A (ja) 疾患管理を医師ワークフローにて統合するためのシステム及び方法
Coleman Health literacy and clear communication best practices for telemedicine
Fredericksen et al. Provider perceptions of the value of same-day, electronic patient-reported measures for use in clinical HIV care
WO1995006296A1 (fr) Automated method and system for eliciting confidential information from a patient
Taieb-Maimon et al. Increasing recognition of wrong-patient errors through improved interface design of a computerized provider order entry system
Lor et al. “There are so many nuances...”: Health care providers’ perspectives of pain communication with Hmong patients in primary care settings
Fridey et al. A question of clarity: redesigning the American Association of Blood Banks blood donor history questionnaire—a chronology and model for donor screening
Griesemer et al. Examining ACCURE’s nurse navigation through an antiracist lens: transparency and accountability in cancer care
JP2023080373A (ja) 問診システム
McCall Phlebotomy essentials
JP2004078629A (ja) 診療支援装置
Piemonte More to the story: how the medical humanities can learn from and enrich health communication studies
Vickers et al. Interfaces for collecting data from patients: 10 golden rules
Nicholls et al. Patient partner perspectives regarding ethically and clinically important aspects of trial design in pragmatic cluster randomized trials for hemodialysis
Allyn et al. Patient safety culture: a real-life story of why culture matters
Zaini et al. Patient-Centred Communication in the Use of Antidepressants among People with Depression: A Scoping Review
Thompson et al. Accessibility and usability of a digital TV health information database
Adler Overcoming psychological insulin resistance
Saidi et al. Attendance To Eye Screening From The Eye of Healthcare Professionals: A Qualitative Finding

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AM AT AU BB BG BR BY CH CN CZ DE DK EE ES FI GB GE HU JP KE KG KP KR KZ LK LT LU LV MD MG MN MW NL NO NZ PL PT RO RU SD SE SI SK TJ TT UA UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE MW SD AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase