EP1485855A2 - Navigation in an electronic healthcare form - Google Patents

Navigation in an electronic healthcare form

Info

Publication number
EP1485855A2
EP1485855A2
Authority
EP
European Patent Office
Prior art keywords
user
graphical
documentation
image element
graphical image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP03716402A
Other languages
German (de)
English (en)
Inventor
Catherine Britton
Kiron Rao
Terri H. Steinberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens Medical Solutions USA Inc
Original Assignee
Siemens Medical Solutions Health Services Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/383,299 (published as US20040008223A1)
Application filed by Siemens Medical Solutions Health Services Corp
Publication of EP1485855A2
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/93: Document management systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/10: Text processing

Definitions

  • Certain exemplary embodiments of the present invention provide a method for creating user navigable graphical documentation for use in a healthcare information system, comprising the activities of: creating documentation by supporting a user in: importing a graphical image element from a repository, decomposing said graphical image element into a plurality of segments, establishing links between individual segments of said plurality of segments and an encompassing graphical image element to support navigation within said encompassing graphical image element responsive to a user navigation command, and linking a graphical image element segment with an object comprising text associated with said graphical image element segment.
  • the method can also comprise the activities of associating a name with said documentation, and storing said created documentation in response to a user command.
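  • As a non-authoritative illustration of the documentation model summarized above, a minimal Python sketch follows; the class names, attributes, and repository path are assumptions made for the sketch, not terms defined by the application.

```python
# Minimal sketch of user-navigable graphical documentation: an image imported
# from a repository, decomposed into named segments, each optionally linked to
# a text object. All names here are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class TextObject:
    text: str                              # text associated with a segment


@dataclass
class Segment:
    name: str                              # e.g. "left ventricle"
    parent_image: str                      # encompassing graphical image element
    text_object: Optional[TextObject] = None


@dataclass
class GraphicalDocumentation:
    name: str
    image_path: str                        # graphical image element from a repository
    segments: list = field(default_factory=list)

    def navigate(self, segment_name: str) -> Optional[Segment]:
        """Resolve a user navigation command to one segment of the image."""
        return next((s for s in self.segments if s.name == segment_name), None)


doc = GraphicalDocumentation(
    name="Cardiac overview",
    image_path="repository/heart.png",     # hypothetical repository entry
    segments=[
        Segment("left ventricle", "Cardiac overview",
                TextObject("Left ventricle: pumps oxygenated blood.")),
        Segment("right atrium", "Cardiac overview"),
    ],
)
print(doc.navigate("left ventricle").text_object.text)
```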
  • FIG. 1 is a flow diagram of an exemplary embodiment of a method 1000 of the present invention.
  • FIG. 2 is a flow diagram of an exemplary embodiment of a method 2000 of the present invention.
  • FIG. 3 is a flow diagram of an exemplary embodiment of a method 3000 of the present invention.
  • FIG. 4 is a flow diagram of an exemplary embodiment of a method 4000 of the present invention.
  • FIG. 5 is a block diagram of an exemplary embodiment of a system 5000 of the present invention.
  • FIG. 6 is a block diagram of an exemplary embodiment of an information device 6000 of the present invention.
  • FIG. 7 is a diagram of an exemplary embodiment of a user interface 7000 of the present invention.
  • FIG. 8 is a diagram of an exemplary embodiment of a user interface 8000 of the present invention.
  • FIG. 9 is a diagram of an exemplary embodiment of a user interface 9000 of the present invention.
  • FIG. 10 is a diagram of an exemplary embodiment of a user interface 10000 of the present invention.
  • FIG. 11 is a diagram of an exemplary embodiment of a user interface 11000 of the present invention.
  • FIG. 12 is a diagram of an exemplary embodiment of a user interface 12000 of the present invention.
  • FIG. 13 is a diagram of an exemplary embodiment of a user interface 13000 of the present invention.
  • FIG. 14 is a diagram of an exemplary embodiment of a user interface 14000 of the present invention.
  • FIG. 15 is a diagram of an exemplary embodiment of a user interface 15000 of the present invention.
  • FIG. 1 is a flow diagram of an exemplary embodiment of a method 1000 of the present invention. Note that although various activities are presented in a numbered sequence, and are connected with arrows to an exemplary embodiment of method 1000, there is no general requirement that the activities be performed in any particular order or any particular number of times, or that all activities be performed. Moreover, any activity can be performed automatically and/or manually. Also, any activity can be combined and/or performed in conjunction with any activity of any other method described herein.
  • a traditional paper form, such as any paper form commonly used in healthcare management, may be scanned. Once generated via the scanning process, the resulting image may be stored as an electronic template form in a repository of forms.
  • a particular template form from a plurality of template forms in a forms repository may be selected and an image representing the template form may be rendered (as used herein, the word "rendered" means made perceptible to a human, via for example any visual and/or audio means, such as via a display, a monitor, electric paper, an ocular implant, a speaker, a cochlear implant, etc.).
  • a user may then modify the form by selecting a portion of the form that is of interest, and creating a template data field that may appear to overlay or underlay the portion of interest.
  • a user may create a template data field called "telephone number" by using a selection tool to draw and/or define a selection rectangle having borders that at least roughly correspond to the borders of a telephone number "box" that is part of the apparently underlying image.
  • the template form may be modified by linking a selected template data field, such as the field created in the preceding paragraph, to a user- selected object.
  • the object may be selected from a list of objects.
  • the list of objects may be created beforehand.
  • a user may select a related object from the list and modify that related object to reflect the attributes of the desired object, then name and save the desired object, such that the name of the desired object is displayed along with the names of other objects when the list of objects is rendered.
  • a saved object may be saved to a local directory and/or database and/or to a remote directory and/or database, such as a drive and/or database connected via a network, such as the Internet, an intranet, a public switched network, a private network, etc.
  • a link to the saved object may be any form of link, such as a hyperlink and/or URL.
  • the list of objects may be categorized and/or may present a particular category of objects.
  • the list may be associated with a particular role and/or title of a healthcare worker, such as "Admissions Administrator" or "Cardiac Care Nurse".
  • the list may be associated with a particular activity to be performed in providing healthcare to a patient, such as for example, admitting the patient to a healthcare facility, or fulfilling a laboratory testing request for the patient, or administering medication to the patient.
  • an object may be assigned to a category representing, for example, a worker role, worker title, and/or healthcare activity, etc., and the list may reflect that category and/or categorization.
  • the user-selected object may be created by selecting a name, data type, data length, and/or action for the object, etc. Once created, the object may be saved. The object may be related to other items in the list before, during, and/or after creation of the object.
  • the user-selected object may be usable for entering data into a database.
  • a template data field labeled "home telephone number" may be linked to a field in a patient database for home telephone number.
  • data entered for the template data field via the template form may be transferred to one or more databases, potentially depending on a particular role and/or title of a healthcare worker, and/or a particular activity that has been or will be performed in providing healthcare to a patient.
  • the user-selected object may be usable for determining a query to be used for soliciting data for entry in the user selected data field.
  • a query may be rendered indicating "What is the patient's home telephone number, including area code?"
  • a query may be rendered indicating "What is the patient's oral temperature, in degrees Fahrenheit?"
  • the user-selected object may be useable for forming a query of data associated with the user selected data field.
  • a name may be associated with the modified form, such as for example, "New Patient Admission Form" or "Medication Administration Record".
  • the name may be suggested to a user.
  • the user may provide the name.
  • the modified form may be stored in the forms repository.
  • when a user selects the modified form, the form may be associated with a particular patient and/or a particular healthcare worker.
  • the form may be at least partially pre-populated with data regarding the patient when the form is selected.
  • the form may be at least partially pre-populated with data regarding the healthcare worker.
  • the form may be at least partially populated, as appropriate, with data regarding that person.
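  • The linkage between template data fields and database fields described above can be sketched as follows; the field names and the dict-backed "database" are assumptions for illustration only, not the application's data model.

```python
# Sketch of a template data field drawn over a scanned form image and linked
# to a patient-database field: data entered via the form is written to the
# database, and a selected form can be pre-populated from the same linkage.
from dataclasses import dataclass


@dataclass
class TemplateDataField:
    label: str                             # e.g. "home telephone number"
    rect: tuple                            # (x, y, width, height) over the scanned image
    db_field: str                          # linked patient-database field


patient_db = {"12345": {"last_name": "Smith"}}   # toy stand-in for a patient database


def store_entry(field_def: TemplateDataField, patient_id: str, value: str) -> None:
    """Transfer data entered via the template form to the linked database field."""
    patient_db.setdefault(patient_id, {})[field_def.db_field] = value


def prepopulate(fields: list, patient_id: str) -> dict:
    """Pre-populate a selected form with data already on record for the patient."""
    record = patient_db.get(patient_id, {})
    return {f.label: record.get(f.db_field, "") for f in fields}


phone = TemplateDataField("home telephone number", (120, 480, 200, 24), "home_phone")
store_entry(phone, "12345", "555-0100")
print(prepopulate([phone], "12345"))       # {'home telephone number': '555-0100'}
```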
  • FIG. 2 is a flow diagram of an exemplary embodiment of a method 2000 of the present invention. Note that although various activities are presented in a numbered sequence, and are connected with arrows to an exemplary embodiment of method 2000, there is no general requirement that the activities be performed in any particular order or any particular number of times, or that all activities be performed. Moreover, any activity may be performed automatically and/or manually. Also, any activity may be combined and/or performed in conjunction with any activity of any other method described herein.
  • At activity 2100, patient information may be received. Such information may be received via any means, including for example, keyboard entry, voice-entry, selection from a list of patients, push technology, activation of a hyperlink contained in an e-mail message, etc.
  • a template form may be retrieved for scheduling a visit.
  • the template form may be user-selected via for example a graphical user interface, and/or associated with a visit scheduling activity and/or object.
  • in response to a user selection, a patient visit type, a visit appointment date and time, a service, and/or an activity may be selected.
  • a patient visit type may be selected from a list including, for example: routine physical, lab work, testing, counseling, out-patient procedure, etc.
  • a visit appointment date and time may be selected from a graphical user interface resembling, for example, a calendar and/or clock.
  • Services and/or activities may be selected from a list including, for example: measure blood pressure, measure weight, draw blood sample, provide exercise counseling, etc.
  • a scheduling form may be populated with the obtained and/or selected information, such as the patient identification information, patient visit type, visit appointment date and time, service, and/or activity.
  • the populated scheduling form may be communicated to a recipient application to enable a user of that application to schedule a patient visit, via for example a graphical user interface.
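  • A hedged sketch of the scheduling workflow outlined above is shown below; the field names and the JSON transport to the recipient application are assumptions, not details given in the application.

```python
# Populate a scheduling form with the obtained selections and serialize it
# for communication to a recipient scheduling application.
from dataclasses import dataclass, asdict
import json


@dataclass
class SchedulingForm:
    patient_id: str
    visit_type: str                        # e.g. "routine physical", "lab work"
    appointment: str                       # date and time picked from a calendar widget
    services: list                         # e.g. ["measure blood pressure"]


def send_to_recipient(form: SchedulingForm) -> str:
    """Serialize the populated scheduling form for the recipient application."""
    return json.dumps(asdict(form))


form = SchedulingForm("12345", "routine physical", "2003-03-17T09:30",
                      ["measure blood pressure", "measure weight"])
print(send_to_recipient(form))
```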
  • FIG. 3 is a flow diagram of an exemplary embodiment of a method 3000 of the present invention. Note that although various activities are presented in a numbered sequence, and are connected with arrows to an exemplary embodiment of method 3000, there is no general requirement that the activities be performed in any particular order or any particular number of times, or that all activities be performed. Moreover, any activity may be performed automatically and/or manually. Also, any activity may be combined and/or performed in conjunction with any activity of any other method described herein.
  • an image of a first anatomical feature, such as that found in an anatomy treatise or textbook, may be scanned.
  • the first anatomical feature could be a human heart.
  • the resulting image may be stored as a first electronic image file in a repository of such image files.
  • the first electronic image file may be generated via obtaining clip art of the desired first anatomical feature.
  • the first image file may be imported into an electronic document, such as via a "copy" and "paste" routine.
  • the first image file itself may be utilized as the electronic document.
  • the electronic document may be modified by decomposing the image of the anatomical feature into a plurality of segments, portions, and/or views of the anatomical feature, such as via creating an object corresponding to a chosen segment, portion, and/or view.
  • where the anatomical feature is a human body, a portion of the body, such as the heart, could be selected by using a selection tool to draw and/or define a selection polygon and/or shape having borders that at least roughly correspond to the borders of the heart as visible in the apparently underlying image of the human body.
  • the pixels and/or locations within the borders could correspond to locations a user might click and/or select to activate display of a linked object, such as a linked graphic image of the selected portion of the anatomical feature.
  • the bordered region may be named, grouped with other bordered regions, browsed, mapped to a database element, and/or have its own linked image.
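  • The hit-testing implied by such bordered regions can be sketched with a standard ray-casting point-in-polygon test, as below; the region names, coordinates, and linked file names are invented for illustration and are not part of the application.

```python
# Hit-test a user click against a named, bordered region drawn over an
# anatomical image; a hit activates display of the linked image.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Region:
    name: str
    polygon: list                          # border vertices from the selection tool
    linked_image: str                      # image rendered when the region is activated


def contains(polygon, point) -> bool:
    """Ray-casting test: is the point inside the polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside


def on_click(regions, click) -> Optional[str]:
    """Return the linked image for the region under the click, if any."""
    for region in regions:
        if contains(region.polygon, click):
            return region.linked_image
    return None


heart = Region("heart", [(200, 150), (260, 150), (260, 230), (200, 230)],
               "heart_detail.png")
print(on_click([heart], (230, 190)))       # heart_detail.png
```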
  • where the anatomical feature is an arterial system and/or subsystem, such as the arteries serving the heart muscle itself, various arterial segments may be selected and assigned a corresponding object.
  • an object may be assigned to, for example, the mid LAD or to the distal RCA.
  • An object may inherit characteristics from a neighboring segment object. Thus, assuming a segment object has already been defined for the upper distal RCA, characteristics of that object may be provided to a newly created object for the middle distal RCA.
  • the object associated with the portion and/or view of the first image may be linked to a second image file, to enable navigation from the first image to the second image.
  • for example, via one or more lists and/or pop-up menus of objects representing human body parts, organs, views, and/or systems, and/or heart components, views, and/or subsystems, the object corresponding to the human body may be linked to a detailed image of a heart to enable a user to navigate to the detailed image by clicking on the image of the human body in the vicinity of the heart.
  • the second image could be considered a child of the parent first image.
  • Any parent may have multiple children. Any child may be associated with multiple parents. Any child may specify a parent from which the child inherits one or more attributes and/or properties, such as a window size within which the image is displayed, font for any corresponding text, etc.
  • a parent may specify default properties for its children. In certain embodiments, a child may override such default properties. In certain embodiments, a child may not override such default properties.
  • the first image may render indicators of those regions to which objects are associated and/or second images are linked. Such indicators may be rendered as hot spots, mouse-overs, and/or a list of regions. For example, a user may click on an icon and/or press a particular keyboard combination and all linked regions will be displayed with bright red borders. As another example, a user may move a pointer over a region and its border will be displayed in red, and/or a textual label for the region will appear, and/or an address and/or name of the image to which the region is associated will be displayed.
  • a child may render indicators of each parent with which it is associated.
  • a child may render an indicator of one or more branches of its family tree. That is, if the child was rendered as a result of navigation from a grandparent image to a parent image to the child image, that navigational path may be rendered. Potentially, the rendering of the navigational path may include a hyperlink associated with each image in the path to enable rapid return to an image of interest.
  • any image may include a display of its descendants to any desired number of generations, thereby enabling rapid navigation to a particular descendant of interest, such as a great-grandchild image. Such a display of descendants may be in the form of a tree having branches with names for the corresponding descendant and/or miniature previews of each descendant.
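  • A simplified sketch of the parent/child image relationships described above follows: default display properties inherited from ancestors, per-child overrides, and a breadcrumb-style navigational path. For brevity each child has a single parent here, although the text above also allows multiple parents; all names are assumptions.

```python
# Parent/child image nodes with inherited default properties and a
# navigational path suitable for rendering as breadcrumb hyperlinks.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ImageNode:
    name: str
    properties: dict = field(default_factory=dict)   # e.g. window size, font
    parent: Optional["ImageNode"] = None
    children: list = field(default_factory=list)

    def add_child(self, child: "ImageNode") -> "ImageNode":
        child.parent = self
        self.children.append(child)
        return child

    def effective_properties(self) -> dict:
        """A child's own properties override defaults specified by its ancestors."""
        inherited = self.parent.effective_properties() if self.parent else {}
        return {**inherited, **self.properties}

    def path(self) -> list:
        """Navigational path from the root image down to this image."""
        return (self.parent.path() if self.parent else []) + [self.name]


body = ImageNode("human body", {"font": "Arial", "window": (800, 600)})
heart = body.add_child(ImageNode("heart"))
ventricle = heart.add_child(ImageNode("left ventricle", {"window": (400, 300)}))

print(ventricle.path())                    # ['human body', 'heart', 'left ventricle']
print(ventricle.effective_properties())    # {'font': 'Arial', 'window': (400, 300)}
```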
  • a name may be associated with the modified electronic document that comprises the object linked to the second image file.
  • the modified electronic document may be stored.
  • portions of the modified electronic document may be named and/or stored. For example, a user may specify that only the graphical aspects of an electronic document are to be stored in a file of a particular name. As another example, a user may specify the storage of only the textual aspects of an electronic document. As another example, a user may specify the storage of both the graphical and textual aspects of an electronic document, but without any objects that link to databases and/or other documents.
  • a miniature preview of the electronic document may be named and/or stored, either individually and/or combined with any portion of the electronic document, including the entire electronic document.
  • the created and/or modified documentation may be associated with, for example, a particular patient, a particular healthcare activity, a particular procedure, and/or a particular healthcare worker.
  • data related to the associated patient, healthcare activity, procedure, and/or healthcare worker may be included in the document.
  • activities 3300 through 3400 may be repeated for additional segments, portions, and/or views of the first anatomical feature.
  • various portions of the first anatomical feature may be linked to detailed views of, for example, the head, brain, digestive tract, lungs, urinary tract, blood vessels, etc.
  • activities 3100 through 3600 may be repeated using the second image file as a starting point.
  • an electronic document providing an image of the human heart may have associated navigable objects, each linking a different portion of the image of the heart (such as the ventricles, arteries, veins, etc.) to a detailed image of that portion.
  • Such a detailed image may be more than merely a magnification of the parent image. Instead, it may contain additional detail and/or objects not found in the parent image.
  • a user who views for example, the first electronic document displaying the image of the human body may navigate to a detailed image of the heart by clicking in the vicinity of the heart.
  • the user may click in the region of the left ventricle to cause a detailed image of the left ventricle to be rendered.
  • embodiments of method 3000 may provide customizable interactive graphical documents.
  • the object may be linked and/or associated with an element of one or more databases, such as a field of a database. For example, clicking on a predetermined location and/or area of a graphical image may generate one or more queries to a database and potentially return data contained within one or more fields of the database.
  • the object may be defined such that selecting a particular location and/or area of a graphical image, such as via clicking, may allow and/or cause entry of data into a corresponding field of one or more databases. Data entry may occur via any means, including keying, clicking, gesturing, speaking, etc. Data entry may be implied via the nature of the defined object and/or a sequence of preceding events.
  • the object may be linked and/or associated with a location in an electronic document. For example, clicking on a predetermined location and/or area of a graphical image may cause an electronic document to open and/or a predetermined portion of the electronic document to be rendered. For instance, clicking on an image of a left ventricle in an image of a human heart could cause one or more paragraphs from a treatise, report, or paper relating to the left ventricle to be displayed. In certain embodiments, a list of treatises, reports, and/or papers containing such paragraphs could be rendered, enabling the user to select the desired source for display.
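  • A toy dispatcher for the three kinds of object links described above is sketched next: querying a database field, entering data into a database field, and opening an electronic document at a predetermined location. The record store and link fields are stand-ins invented for the sketch, not the application's data model.

```python
# Dispatch activation of an object link to a database query, a data entry,
# or the opening of a document at a given location.
from dataclasses import dataclass
from typing import Optional

records = {"12345": {"home_phone": "555-0100"}}   # toy patient "database"


@dataclass
class ObjectLink:
    kind: str                              # "query", "entry", or "document"
    target: str                            # database field or document location
    document: Optional[str] = None


def activate(link: ObjectLink, patient_id: str, value: Optional[str] = None):
    if link.kind == "query":               # clicking returns data from the linked field
        return records.get(patient_id, {}).get(link.target)
    if link.kind == "entry":               # clicking causes entry of data into the field
        records.setdefault(patient_id, {})[link.target] = value
        return value
    if link.kind == "document":            # clicking opens a document at a location
        return f"open {link.document} at section '{link.target}'"
    raise ValueError(f"unknown link kind: {link.kind}")


print(activate(ObjectLink("query", "home_phone"), "12345"))
print(activate(ObjectLink("document", "left ventricle", "cardiology_report.pdf"), "12345"))
```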
  • textual information corresponding to the displayed anatomical feature may be rendered.
  • textual information associated with the heart may be displayed when an image of the heart is displayed.
  • Such information may describe the names of various regions of the heart, measurements, data, conditions, observations, diagnosis, recommendations, treatment plan, surgical plan, intervention plan, and/or prognosis relating to a particular patient's heart, heart regions, and/or heart systems.
  • the textual information may be rendered within, over, near, and/or next to the graphical image.
  • the textual information may be rendered independently of the graphical information, such as in a separate window that may be opened and closed while viewing the electronic document containing the image.
  • graphical images may appear to overlay other graphical images.
  • a graphical image showing an arterial view of the heart may include an image of a stent that has been prescribed and/or implanted as an intervention for a stenosis condition. Either displayed with the arterial view, or by clicking on an image of the stent included in the arterial view, textual information regarding the stent may be rendered, such as for example, its dimensions, materials, features, manufacturer, brand, style, item number, implantation technique, date of implantation, implantation location, current location, etc.
  • a graphical user interface may be provided with various tools for drawing and/or placing various shapes and/or images such that they appear over the apparently underlying image file.
  • a toolbox containing various types of stent objects may also be displayed, allowing the viewer to place an object comprising an image of a stent over an appropriate location of the arterial image.
  • the stent may be anchored to one or more particular locations in the underlying image. For example, both ends of the stent may be anchored to desired locations in the underlying artery.
  • the exemplary stent object may be linked to various textual data regarding the stent. For example, upon selecting a stent object from the toolbox, a user may be queried for the type and/or manufacturer of the desired stent by presenting a list of stent types and/or manufacturers. In certain embodiments, the user may specify as much or as little information about the stent as is appropriate for the particular situation, with the option to specify additional and/or different information at a later time.
  • the selection of an object may be linked to one or more databases, such as a supplies inventory database.
  • selection of an object may potentially indicate that one or more physical objects corresponding to the selected electronic object have been used, consumed, and/or removed from inventory, potentially triggering re-ordering of the physical object to restore the inventory.
  • selection of an object may indicate that certain procedures may and/or will be performed, thereby potentially defining certain physical tasks to be performed. For example, selection of a stent may indicate that the stent was implanted, implying that various surgical tools were utilized, and implying that those surgical tools should be expected to soon arrive at a cleaning facility for sterilization. Such information may guide management of activities at the cleaning facility.
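  • The supplies-inventory side effect described above can be sketched as below: selecting a stent object records consumption of the corresponding physical item and triggers a re-order once stock reaches a threshold. Item names, quantities, and thresholds are hypothetical.

```python
# Decrement stock for the physical object behind a selected electronic object
# and signal a re-order when the quantity on hand reaches the reorder point.
from typing import Optional

inventory = {"stent-3.0mm": {"on_hand": 4, "reorder_at": 2}}


def consume(item_id: str) -> Optional[str]:
    """Record consumption of one unit and return a re-order notice if needed."""
    record = inventory[item_id]
    record["on_hand"] -= 1
    if record["on_hand"] <= record["reorder_at"]:
        return f"re-order triggered for {item_id}"
    return None


print(consume("stent-3.0mm"))              # None (4 -> 3)
print(consume("stent-3.0mm"))              # re-order triggered for stent-3.0mm (3 -> 2)
```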
  • Such objects may include anatomical variations (e.g., tilted bladder, enlarged ventricle, muscular atrophy, osteoporosis, etc.), anatomical injuries (e.g., a collapsed lung, broken bone, torn meniscus, scar tissue, etc.), anatomical diseases (e.g., cirrhosis, ulcer, clogged artery, etc.), and/or surgical and/or diagnostic techniques (e.g., an appendectomy, laparoscopy, endoscopy, etc.).
  • an object may be selected that overlays an image of a colon with attached normal appendix with an image of a colon with an inflamed appendix.
  • an object may be selected that overlays an image of a colon with attached normal appendix with an image of a colon with a removed appendix.
  • an object may be selected that provides an image of an endoscope that may be manipulated to correspond to the contours of an underlying colon.
  • FIG. 4 is a flow diagram of an exemplary embodiment of a method 4000 of the present invention. Note that although various activities are presented in a numbered sequence, and are connected with arrows to an exemplary embodiment of method 4000, there is no general requirement that the activities be performed in any particular order or any particular number of times, or that all activities be performed. Moreover, any activity may be performed automatically and/or manually. Also, any activity may be combined and/or performed in conjunction with any activity of any other method described herein.
  • patient identification information may be received by a user and entered into a computer interface, such as a graphical user interface.
  • the patient identification information may be received by a computer system.
  • user navigable graphical documentation may be retrieved and rendered to a user.
  • Such documentation may be created using any appropriate method, including method 3000.
  • the user navigable graphical documentation may be updated to reflect a patient condition, including measurements, data, observations, diagnoses, recommendations, treatment plans, surgical plans, intervention plans, and/or prognoses.
  • the documentation may include data related to a particular healthcare activity, a particular procedure, and/or a particular healthcare worker.
  • the updated documentation may be stored in association with the patient's medical records.
  • FIG. 5 is a block diagram of an exemplary embodiment of a system 5000 of the present invention.
  • system 5000 may be viewed as illustrative, and unless specified otherwise, should not be construed to limit the implementation of any of methods 1000, 2000, 3000, and/or 4000, and/or the scope of any claims attached hereto.
  • System 5000 may comprise one or more information devices 5100, 5200, 5300 inter-connected via a network 5400. Any of information devices 5100, 5200, 5300 may have any number of databases coupled thereto. For example, information device 5100 may be coupled to and/or host databases 5120 and 5140, information device 5200 may be coupled to and/or host database 5220, and/or information device 5300 may be coupled to and/or host databases 5320 and 5340. Moreover, any information device may act as a bridge, gateway, and/or server of its databases to any other information device. Thus, for example, information device 5100 may access database 5320 via information device 5300.
  • a scanner 5160 may be coupled to any of information devices 5100, 5200, 5300.
  • Network 5400 may be any type of communications network, including, for example, a packet switched, connectionless, IP, Internet, intranet, LAN, WAN, connection-oriented, switched, and/or telephone network.
  • FIG. 6 is a block diagram of an exemplary embodiment of an information device 6000 of the present invention.
  • Information device 6000 may represent any of information devices 5100, 5200, 5300 of FIG. 5.
  • information device 6000 may be implemented on a general purpose or special purpose computer, such as a personal computer, workstation, server, minicomputer, mainframe, supercomputer, laptop, and/or Personal Digital Assistant (PDA), etc., a programmed microprocessor or microcontroller and/or peripheral integrated circuit elements, an ASIC or other integrated circuit, a hardware electronic logic circuit such as a discrete element circuit, and/or a programmable logic device such as a PLD, PLA, FPGA, or PAL, or the like, etc.
  • PDA Personal Digital Assistant
  • Information device 6000 may include well-known components such as one or more communication interfaces 6100, one or more processors 6200, one or more memories 6300 containing instructions 6400, and/or one or more input/output (I/O) devices 6500, etc.
  • communication interface 6100 may be and/or include a bus, connector, network adapter, wireless network interface, wired network interface, modem, radio receiver, transceiver, and/or antenna, etc.
  • Each processor 6200 may be a commercially available general-purpose microprocessor.
  • the processor may be an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA) that has been designed to implement in its hardware and/or firmware at least a part of a method in accordance with an embodiment of the present invention.
  • ASIC Application Specific Integrated Circuit
  • FPGA Field Programmable Gate Array
  • Memory 6300 may be coupled to processor 6200 and may comprise any device capable of storing analog or digital information, such as a hard disk, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, a compact disk, a digital versatile disk (DVD), a magnetic tape, a floppy disk, and any combination thereof.
  • Memory 6300 may also comprise a database, an archive, and/or any stored data and/or instructions.
  • memory 6300 may store instructions 6400 adapted to be executed by processor 6200 according to one or more activities of a method of the present invention.
  • Instructions 6400 may be embodied in software, which may take any of numerous forms that are well known in the art, including for example, Visual Basic by Microsoft Corporation of Redmond, Washington. Instructions 6400 may control operation of information device 6000 and/or one or more other devices, systems, or subsystems coupled thereto.
  • I/O device 6500 may be an audio and/or visual device, including, for example, a monitor, display, indicator, light, keyboard, keypad, touchpad, pointing device, microphone, speaker, telephone, fax, video camera, camera, scanner, and/or printer, including a port to which an I/O device may be attached, connected, and/or coupled.
  • FIG. 7 is a diagram of an exemplary embodiment of a user interface 7000 of the present invention.
  • User interface 7000 may render an image 7010 of a scanned healthcare form, such as a new patient data form that might be used for creating a medical record file for a new patient, and/or for admitting a new patient.
  • Image 7010 might include a logo 7100 of the healthcare provider, a title for the form, and various fields 7300, such as fields for last name 7310, first name 7320, middle name 7330, social security number 7340, street address 7350, city 7360, state 7370, zip code 7380, and/or home telephone number 7390.
  • FIG. 8 is a diagram of an exemplary embodiment of a user interface 8000 of the present invention.
  • User interface 8000 may render an image of a toolbox, window, or palette 8100 that may include various tools or controls for creating, specifying, and/or manipulating objects to be associated with an electronic document that includes image 7010 of FIG. 7.
  • Such controls may include tools for creating a label (L) 8250, text box (TB) 8200, combo box (CB) 8350, list box (LB) 8300, linked image 8400, object property 8500, and/or object criteria 8600.
  • a save tool 8700 may also be provided for commanding that a document be saved.
  • a user can, for example, select "New Template" from a "Template" menu. From toolbox 8100, the user may click on a desired control and drag it into position on the new template to add that control to the new template.
  • the user may change properties of the selected control and/or the template as desired. For example, a user may specify an appearance, background color, background style, border style, foreground color, font, font style, font size, alignment, line spacing, indent, maximum data length, validation, query, cursor type, pointer type, autosizing, position, and/or dimension, etc. for a control.
  • a control may be associated with a database field. Data entry via the control may be prompted by a query. Data entry via the control may be validated. Searches of the database may be performed using one or more queries entered via one or more controls.
  • a user may specify a name, start-up behavior, access control, password, window type, window position, horizontal dimension, vertical dimension, data entry order, tab order, page breaks, header, footer, etc. for the template.
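  • As a sketch of the control properties described above, the record below binds a text-box control to a database field with user-settable properties, a data-entry query, and a validation hook; every property name and the validation rule are assumptions for illustration.

```python
# A template-designer control bound to a database field, with display
# properties, a prompt used to solicit data entry, and simple validation.
from dataclasses import dataclass, field
from typing import Callable, Optional


@dataclass
class Control:
    kind: str                              # "text box", "combo box", "list box", ...
    db_field: Optional[str] = None         # database field the control is bound to
    query: str = ""                        # prompt rendered to solicit data entry
    properties: dict = field(default_factory=dict)    # font, colors, alignment, ...
    validate: Optional[Callable[[str], bool]] = None


phone_box = Control(
    kind="text box",
    db_field="home_phone",
    query="What is the patient's home telephone number, including area code?",
    properties={"font": "Arial", "font_size": 10, "max_length": 14},
    validate=lambda v: v.replace("-", "").isdigit(),
)
print(phone_box.validate("555-0100"))      # True
```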
  • a template may be associated with a category and/or group of templates.
  • an EKG template may be associated with a category, group, folder, and/or sub-directory of cardiology templates.
  • via, for example, a template browser, templates may be moved from one category and/or group to another, opened, renamed, modified, and/or deleted.
  • access control for one or more templates and/or groups of templates may be specified.
  • FIG. 9 is a diagram of an exemplary embodiment of a user interface 9000 of the present invention.
  • User interface 9000 may display an object, such as template object A 9100 that has been created using a control from toolbox 8100 of FIG. 8 "over" a scanned form image 7010 of FIG. 7.
  • FIG. 10 is a diagram of an exemplary embodiment of a user interface 10000 of the present invention.
  • User interface 10000 may display a list of objects 10100, such as template objects A, B, and C.
  • user interface 10000 may display information 10200 regarding an object selected from list 10100, such as the object name 10300, an associated database field 10400, and/or a data type 10500 (e.g., character, variable-length character, integer, and/or Boolean, etc.).
  • FIG. 11 is a diagram of an exemplary embodiment of a user interface 11000 of the present invention.
  • User interface 11000 may display textual information 11100 and/or graphical information 11200, such as an image of an anatomical feature, for example, an image of a human body. Textual information 11100 may identify a navigation path through graphical information 11200, for example.
  • FIG. 12 is a diagram of an exemplary embodiment of a user interface 12000 of the present invention.
  • User interface 12000 may display textual information 12100 that overlays and/or is linked to graphical information 12200.
  • user interface 12000 may provide a graphical image of a human heart 12200, areas of which may be labeled via descriptive textual information 12100. Any portion of graphical information 12200 (such as the left ventricle area) and/or textual information 12100 (such as the "left ventricle" label) may be hyperlinked to a detailed image and/or textual information corresponding to that particular portion.
  • FIG. 13 is a diagram of an exemplary embodiment of a user interface 13000 of the present invention.
  • User interface 13000 may include graphical information 13200, such as an image of at least a portion of an arterial system serving a human heart.
  • user interface 13000 may include textual information 13100, such as textual labels of various components of that arterial system (such as, for example, the RCA (right coronary artery) and the Cx (circumflex coronary artery)).
  • textual information 13100 may also include textual navigational and/or notational information.
  • FIG. 14 is a diagram of an exemplary embodiment of a user interface 14000 of the present invention.
  • User interface 14000 may display textual information 14100 and/or graphical information 14200.
  • Textual information 14100 may communicate, for example, observations, notes, measurements, data, considerations, and/or recommendations regarding an anatomical component and/or feature 14110, an investigated aspect of that anatomical component and/or feature 14120, a diagnosis 14130, an intervention and/or treatment plan 14140, an inter-intervention and/or inter-treatment condition, and/or a post-intervention and/or post-treatment situation 14150.
  • Graphical information 14200 may comprise an image, such as an image of an anatomical component and/or feature of concern 14210, and may include additional graphical information, such as stent 14220, and/or textual information 14160.
  • FIG. 15 is a diagram of an exemplary embodiment of a user interface 15000 of the present invention.
  • User interface 15000 may display textual information 15100 and/or graphical information 15200.
  • Textual information 15100 and/or graphical information 15200 may communicate, for example, observations, notes, measurements, data, considerations, and/or recommendations regarding an anatomical component and/or feature, an investigated aspect of that anatomical component and/or feature, a diagnosis, an intervention and/or treatment plan, an inter-intervention and/or inter-treatment condition, and/or a post-intervention and/or post-treatment situation.
  • Graphical information 15200 also may comprise an image, such as an image of an anatomical component and/or feature of concern, a surgical procedure, and/or medical device (such as a stent).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Certain exemplary embodiments of the present invention provide a method for creating user-navigable graphical documentation for use in a healthcare information system. The method comprises creating documentation by supporting a user in: importing a graphical image element from a repository; decomposing said graphical image element into a plurality of segments; establishing links between individual segments of said plurality of segments and an encompassing graphical image element to support navigation within said encompassing graphical image element in response to a user navigation command; and linking a graphical image element segment with an object comprising text associated with said graphical image element segment. The method can also comprise the steps of associating a name with said documentation, and storing said created documentation in response to a user command.
EP03716402A 2002-03-16 2003-03-10 Navigation dans un formulaire de soins de sante electronique Withdrawn EP1485855A2 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US36454002P 2002-03-16 2002-03-16
US364540P 2002-03-16
US10/383,299 US20040008223A1 (en) 2002-03-16 2003-03-07 Electronic healthcare management form navigation
US383299P 2003-03-07
PCT/US2003/007164 WO2003081474A2 (fr) 2002-03-16 2003-03-10 Navigation dans un formulaire de soins de sante electronique

Publications (1)

Publication Number Publication Date
EP1485855A2 (fr) 2004-12-15

Family

ID=31498285

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03716402A Withdrawn EP1485855A2 (fr) 2002-03-16 2003-03-10 Navigation dans un formulaire de soins de sante electronique

Country Status (4)

Country Link
EP (1) EP1485855A2 (fr)
JP (1) JP2005521160A (fr)
CA (1) CA2479387A1 (fr)
WO (1) WO2003081474A2 (fr)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4945476A (en) * 1988-02-26 1990-07-31 Elsevier Science Publishing Company, Inc. Interactive system and method for creating and editing a knowledge base for use as a computerized aid to the cognitive process of diagnosis
US6405072B1 (en) * 1991-01-28 2002-06-11 Sherwood Services Ag Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus
US5525905A (en) * 1994-11-21 1996-06-11 Picker International, Inc. Patient handling system for use on multiple imaging systems
US5740428A (en) * 1995-02-07 1998-04-14 Merge Technologies, Inc. Computer based multimedia medical database management system and user interface
JP2003526864A (ja) * 2000-03-13 2003-09-09 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 医用データを管理する管理システム及び方法
WO2002007091A2 (fr) * 2000-07-14 2002-01-24 Haltsymptoms.Com, Inc. Systeme de navigation electronique a travers des informations relatives a des parties du corps

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO03081474A2 *

Also Published As

Publication number Publication date
WO2003081474A8 (fr) 2004-04-01
WO2003081474A2 (fr) 2003-10-02
WO2003081474A3 (fr) 2004-02-12
CA2479387A1 (fr) 2003-10-02
JP2005521160A (ja) 2005-07-14

Similar Documents

Publication Publication Date Title
US7590932B2 (en) Electronic healthcare management form creation
US20050114283A1 (en) System and method for generating a report using a knowledge base
US7742931B2 (en) Order generation system and user interface suitable for the healthcare field
US7877683B2 (en) Self-organizing report
US20030036927A1 (en) Healthcare information search system and user interface
US20060173858A1 (en) Graphical medical data acquisition system
US8150711B2 (en) Generating and managing medical documentation sets
US20110082710A1 (en) Electronic medical record creation and retrieval system
US20040153338A1 (en) Medical information system
US11501858B1 (en) Visual charting method for creating electronic medical documents
US11557384B2 (en) Collaborative synthesis-based clinical documentation
US20040008223A1 (en) Electronic healthcare management form navigation
US20060041836A1 (en) Information documenting system with improved speed, completeness, retriveability and granularity
US20160092347A1 (en) Medical system test script builder
US20080040161A1 (en) Software for generating documents using an object-based interface and item/property data storage with a bulk multimedia import utility
US20060173710A1 (en) System and user interface supporting item ordering for use in the medical and other fields
EP1485855A2 (fr) Navigation dans un formulaire de soins de sante electronique
US20050210044A1 (en) Software for generating documents using an object-based interface and item/property data storage
JP2008117239A (ja) 医療情報処理システム、所見データ編集装置、所見データ編集方法及びプログラム
JP6250925B2 (ja) パス作成支援プログラム、方法、及び装置
JP3333185B1 (ja) 入力枠の属性変更機能を有する電子カルテシステム
US20070033575A1 (en) Software for linking objects using an object-based interface
Wendler et al. Cooperative image workstation based on explicit models of diagnostic information requirements
Williams et al. Database Query Interface for Medical Information Systems

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040920

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

RIN1 Information on inventor provided before grant (corrected)

Inventor name: STEINBERG, TERRI H.

Inventor name: RAO, KIRON

Inventor name: BRITTON, CATHERINE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20081001