CN111863179A - Medical information processing apparatus, medical information processing method, and program - Google Patents

Info

Publication number
CN111863179A
CN111863179A (application number CN202010330651.2A)
Authority
CN
China
Prior art keywords: event, target, information processing, medical information, disease
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010330651.2A
Other languages
Chinese (zh)
Other versions
CN111863179B (en)
Inventor
薄井小百合
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Medical Systems Corp
Original Assignee
Canon Medical Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Medical Systems Corp
Publication of CN111863179A
Application granted
Publication of CN111863179B
Legal status: Active

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/10: ICT specially adapted for therapies or health-improving plans relating to drugs or medications, e.g. for ensuring correct administration to patients
    • G16H40/63: ICT specially adapted for the operation of medical equipment or devices, for local operation

Abstract

Embodiments relate to a medical information processing apparatus, a medical information processing method, and a program. Provided are a medical information processing apparatus, a medical information processing method, and a program that make it easy to grasp a patient's condition. The medical information processing apparatus of an embodiment includes a reception unit, an extraction unit, and a display control unit. The reception unit receives a selection of a target part or a target disease. The extraction unit extracts past 1st events corresponding to the selected target part or target disease and past 2nd events corresponding to a part or disease associated with the target part or target disease. The display control unit maps the 1st and 2nd events onto a schema (anatomical diagram) and displays them.

Description

Medical information processing apparatus, medical information processing method, and program
The present application claims priority based on Japanese patent application No. 2019-082625 filed on April 24, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to a medical information processing apparatus, a medical information processing method, and a program.
Background
To grasp a patient's condition, a doctor refers to past events such as the patient's symptoms, vital data, medication, examination results, surgery, treatment, and diagnosis results. However, it is not easy to review all of these events. Although the relevant events can be narrowed down through a patient interview or the like, doing so increases the number of steps until a diagnosis is reached, which costs work and time.
Disclosure of Invention
The problem to be solved by the present invention is to make the patient's condition easy to grasp.
The medical information processing apparatus of an embodiment includes a reception unit, an extraction unit, and a display control unit. The reception unit receives a selection of a target part or a target disease. The extraction unit extracts past 1st events corresponding to the selected target part or target disease and past 2nd events corresponding to a part or disease associated with the target part or target disease. The display control unit maps the 1st and 2nd events onto a schema (anatomical diagram) and displays them.
Effect
According to the medical information processing apparatus of the embodiment, the patient's condition can be grasped easily.
Drawings
Fig. 1 is a block diagram showing an example of the configuration of a medical information processing system according to embodiment 1.
Fig. 2 is a diagram showing an example of the regional medical coordination system according to embodiment 1.
Fig. 3 is a diagram showing an example of event display according to embodiment 1.
Fig. 4 is a diagram showing a display example according to embodiment 1.
Fig. 5 is a diagram showing a display example according to embodiment 1.
Fig. 6 is a diagram showing a display example according to embodiment 1.
Fig. 7A is a diagram showing a display example according to embodiment 1.
Fig. 7B is a diagram showing a display example according to embodiment 1.
Fig. 8 is a flowchart for explaining a series of flows of processing performed by the medical information processing apparatus according to embodiment 1.
Fig. 9A is a diagram showing a display example according to embodiment 2.
Fig. 9B is a diagram showing a display example according to embodiment 2.
Fig. 9C is a diagram showing a display example according to embodiment 2.
Fig. 10 is a diagram showing a display example according to embodiment 2.
Fig. 11 is a flowchart for explaining a series of flows of processing performed by the medical information processing apparatus 100 according to embodiment 2.
Fig. 12 is a diagram showing a display example according to embodiment 3.
Detailed Description
Hereinafter, embodiments of a medical information processing apparatus, a medical information processing method, and a program will be described in detail with reference to the drawings.
In embodiment 1, as shown in fig. 1, a medical information processing system 1 including a medical information processing apparatus 100 and a data storage apparatus 200 will be described as an example. Fig. 1 is a block diagram showing an example of the configuration of a medical information processing system 1 according to embodiment 1.
As shown in fig. 1, the medical information processing apparatus 100 and the data storage apparatus 200 are communicably connected via a network NW. The medical information processing apparatus 100 and the data storage apparatus 200 may be installed in the same facility or in different facilities. That is, the network NW may be a local network closed in a facility or may be a network via the internet.
The medical information processing apparatus 100 is an apparatus that collects events such as a patient's symptoms, vital data, medication, examination results, surgery, treatment, and diagnosis results from inside or outside the facility, and presents them to a user. The user may be a doctor, or other medical staff such as a nurse, a pharmacist, a clinical laboratory technician, a medical radiology technician, a physical therapist, a speech-language-hearing therapist, or a dental care professional. The medical information processing apparatus 100 is realized by a computer device such as a workstation. Specifically, as shown in fig. 1, the medical information processing apparatus 100 includes an input interface 110, a display 120, a memory 130, and a processing circuit 140.
The input interface 110 receives various input operations from the user, converts the received input operations into electric signals, and outputs the electric signals to the processing circuit 140. For example, the input interface 110 is implemented by a mouse, a keyboard, a trackball, a switch, a button, a joystick, a touch pad on which input operations are performed by touching its operation surface, a touch screen in which a display screen and a touch pad are integrated, a non-contact input circuit using an optical sensor, a voice input circuit, or the like.
The input interface 110 may be a tablet terminal or the like that can wirelessly communicate with the medical information processing apparatus 100. The input interface 110 is not limited to a physical operation component including a mouse, a keyboard, and the like. For example, a processing circuit that receives an electric signal corresponding to an input operation from an external input device provided separately from the medical information processing apparatus 100 and outputs the electric signal to the processing circuit 140 is also included in the input interface 110.
The display 120 displays various information. For example, the display 120 displays past or same-day events of a visiting patient using characters, images, charts, and the like. Further, for example, the display 120 displays a GUI (Graphical User Interface) for accepting various operations from the user. The display 120 is, for example, a liquid crystal display or a CRT (Cathode Ray Tube) display. The display 120 may be a desktop type, or may be a tablet terminal or the like capable of wireless communication with the medical information processing apparatus 100. In addition, the input interface 110 and the display 120 may be combined; for example, they may be implemented as a single touch screen. The display 120 is an example of a display unit.
Further, the medical information processing apparatus 100 may include a plurality of displays 120. For example, the medical information processing apparatus 100 may include 2 physically separated displays (dual displays) as the display 120. Further, these plural displays 120 may also be controlled in a correlated manner. For example, the plurality of displays 120 may be controlled to display 1 continuous region. In this case, the display area of the display 120 is expanded according to the number of displays 120.
The memory 130 is implemented by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disk, or the like. For example, the memory 130 stores a program for each circuit included in the medical information processing apparatus 100 to realize its function.
The processing circuit 140 controls the entire process performed by the medical information processing apparatus 100 by executing the reception function 141, the extraction function 142, and the display control function 143.
For example, the processing circuit 140 reads and executes a program corresponding to the reception function 141 from the memory 130, thereby receiving an input operation from a user. For example, the processing circuit 140 reads a program corresponding to the extracting function 142 from the memory 130 and executes the program, thereby extracting past events related to the patient from the data storage device 200. Further, for example, the processing circuit 140 displays the event extracted by the extraction function 142 on the display 120 by reading out and executing a program corresponding to the display control function 143 from the memory 130. The receiving function 141 is an example of a receiving unit. The extraction function 142 is an example of an extraction unit. The display control function 143 is an example of a display control unit.
In the medical information processing apparatus 100 shown in fig. 1, each processing function is stored in the memory 130 as a program executable by a computer. The processing circuit 140 is a processor that reads and executes programs from the memory 130 to realize functions corresponding to the respective programs. In other words, the processing circuit 140 in a state where the program is read has a function corresponding to the read program. In fig. 1, the reception function 141, the extraction function 142, and the display control function 143 are described as being realized by a single processing circuit 140, but the processing circuit 140 may be configured by combining a plurality of independent processors, and the functions may be realized by executing programs by the respective processors. In addition, the processing functions of the processing circuit 140 may be implemented by being distributed or combined into a single or multiple processing circuits as appropriate.
The data storage device 200 is a storage device that stores past events for a plurality of patients. For example, the data storage device 200 stores, as past events, symptoms, diagnosis results, and other various events that the patient has had in the past. The data storage device 200 is implemented by a computer device such as a DB (database) server, for example, and stores an event in a semiconductor memory element such as a RAM or a flash memory, or a storage circuit such as a hard disk or an optical disk.
Here, examples of past events other than symptoms and diagnosis results include vital data measured for the patient in the past, records of medication administered to the patient, examination results, records of surgery and treatment, and the like. For example, the data storage device 200 stores each such past event in association with a patient ID indicating the patient concerned.
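As a rough illustration of the event store just described, past events could be kept as records keyed by patient ID. All class, field, and method names below are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical event record: "kind" covers symptom, vital data,
# medication, examination, surgery/treatment, diagnosis, etc.
@dataclass
class Event:
    date: str        # visit date, e.g. "2012-09-01"
    kind: str        # e.g. "symptom", "vital", "medication"
    category: str    # e.g. "headache", "blood pressure"
    detail: str = ""

# The data storage device keeps events in association with a patient ID.
class EventStore:
    def __init__(self):
        self._by_patient: dict[str, list[Event]] = {}

    def add(self, patient_id: str, event: Event) -> None:
        self._by_patient.setdefault(patient_id, []).append(event)

    def events_for(self, patient_id: str) -> list[Event]:
        return list(self._by_patient.get(patient_id, []))

store = EventStore()
store.add("P001", Event("2012-09-01", "symptom", "headache", "moderate pain"))
store.add("P001", Event("2012-09-01", "vital", "blood pressure"))
print(len(store.events_for("P001")))
```

In a real deployment the store would sit behind a DB server as described above; the in-memory dictionary here only sketches the patient-ID-to-events association.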
For example, the data storage device 200 is a management server in the regional medical coordination system shown in fig. 2. Fig. 2 is a diagram showing an example of the regional medical coordination system according to embodiment 1. In fig. 2, two physically separated boxes are shown as management servers; that is, although only 1 data storage device 200 is shown in fig. 1, the medical information processing system 1 may include a plurality of data storage devices 200. The management server may be a so-called cloud server.
For example, the management server collects and stores past events of patients from various systems in hospitals such as "a hospital" or "B hospital" and "C hospital" or in clinics such as "D clinic". Examples of various systems in a Hospital or a medical institution include a Hospital Information System (HIS), a Radiology Information System (RIS), an electronic medical record System, a medical image management System (PACS), and a surgical System.
Further, the management server collects and stores the patient's past events from the care systems of care facilities such as the "H care facility" and the "I care facility". The care may be residential care at a facility or home-visit care. The management server also collects and stores the patient's past events from the medication systems of pharmacies such as the "O pharmacy" and the "P pharmacy".
In the case shown in fig. 2, the medical information processing apparatus 100 is installed in any one of "a hospital", "B hospital", "C hospital", "D clinic", "H care facility", "I care facility", "O pharmacy", and "P pharmacy", for example, and is connected to the data storage apparatus 200 as a management server via the network NW. Alternatively, the medical information processing apparatus 100 may be provided in each of a plurality of facilities shown in fig. 2. That is, fig. 1 shows only 1 medical information processing apparatus 100, but the medical information processing system 1 may include a plurality of medical information processing apparatuses 100.
In addition, the management server in the regional medical coordination system may be omitted. In this case, a management server provided in each of the plurality of facilities and managing past events of the facility functions as the data storage apparatus 200. That is, fig. 1 shows only 1 data storage apparatus 200, but the medical information processing system 1 may include a plurality of data storage apparatuses 200 provided in a plurality of facilities.
It is to be noted that the medical information processing system 1 is described as a system in a regional medical cooperation system, but the embodiment is not limited to this. For example, the medical information processing system 1 may be a system closed in 1 hospital. Further, although the data storage apparatus 200 has been described on the assumption that various past events are stored, the memory 130 may store a part or all of the past events.
The overall configuration of the medical information processing system 1 according to embodiment 1 is described above. With this configuration, the medical information processing apparatus 100 in the medical information processing system 1 presents the past events of the patient stored in the data storage apparatus 200 or the memory 130 to the user as appropriate, thereby facilitating the grasping of the patient state. Hereinafter, the process of the medical information processing system 1 according to embodiment 1 will be described in detail.
First, an example of event display will be described with reference to fig. 3. Fig. 3 is a diagram showing an example of event display according to embodiment 1. For example, when a patient visits the hospital, the medical information processing apparatus 100 collects events related to that patient from the data storage apparatus 200 and displays them along a time axis. Here, the patient may be one on an initial visit or one returning after the initial visit.
As an example, the medical information processing apparatus 100 collects the past events of each facility included in the regional medical cooperation system from the management server shown in fig. 2. That is, the medical information processing apparatus 100 collects events such as the patient's past symptoms, vital data, medication, examination results, surgery, treatment, and diagnosis results from the management server. Alternatively, the medical information processing apparatus 100 may collect the patient's past events directly from each facility included in the regional medical cooperation system. The medical information processing apparatus 100 then creates the table of fig. 3 by arranging the collected past events in time series, and displays the table on the display 120. Note that, in fig. 3, the events in the table are indicated by characters, but the medical information processing apparatus 100 may display some or all of the events as predetermined icons.
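The arrangement of collected events into the date-keyed rows of a table like fig. 3 can be sketched as follows. The tuple layout and sample values are illustrative assumptions, not the patent's data format:

```python
from collections import defaultdict

# Hypothetical collected events: (date, facility, column, value).
events = [
    ("2018-12-25", "Hospital C", "symptom", "chest pain"),
    ("2012-09-01", "Hospital A", "symptom", "headache"),
    ("2012-09-01", "Hospital A", "vital", "blood pressure"),
]

# Group events by visit date; each date becomes one row of the table.
rows = defaultdict(list)
for date, facility, column, value in events:
    rows[date].append((facility, column, value))

# Sorting the dates yields the time-series order of the displayed table.
for date in sorted(rows):
    print(date, rows[date])
```

The real apparatus would render these rows with per-category columns (symptoms, vital data, examinations, and so on) rather than printing them.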
More specifically, the table of fig. 3 summarizes events for each date on which the patient visited a facility such as a hospital. For example, the table of fig. 3 shows that on "9/1/2012" the patient visited "hospital A" with a symptom a11 classified as "headache", a symptom a12 classified as "nausea", and a symptom a13 classified as "edema in the legs". The table of fig. 3 also shows that the site of symptom a11 is the "frontal lobe" and that its detail is "moderate pain", and that the site of symptom a13 is "legs".
Symptom a11, symptom a12, and symptom a13 are examples of event a1 relating to symptoms. In the table of fig. 3, 3 columns (symptom a11, symptom a12, and symptom a13) are shown as the columns corresponding to event a1, but the number of these columns may be increased or decreased as appropriate depending on the number of events a1. The symptoms may be the patient's chief complaints or symptoms recognized by a user, such as a doctor, who interviews the patient.
For example, the table of fig. 3 shows that, for the patient who visited "hospital A" on "9/1/2012", an event a21 classified as "examination" occurred, vital data a31 classified as "blood pressure" were recorded, an examination result a41 classified as "MR examination reservation" was recorded, and a medication a61 classified as "pain relief" was administered. The column of diagnosis result a51 for "9/1/2012" in the table of fig. 3 is empty.
Here, vital data a31 and vital data a32 in fig. 3 are examples of event A3 relating to vital data. In the table of fig. 3, 2 columns (vital data a31 and vital data a32) are shown as the columns corresponding to event A3, but the number of these columns may be increased or decreased as appropriate depending on the number of events A3.
Examination result a41 and examination result a42 in fig. 3 are examples of event a4 relating to examination results. In the table of fig. 3, 2 columns (examination result a41 and examination result a42) are shown as the columns corresponding to event a4, but the number of these columns may be increased or decreased as appropriate depending on the number of events a4.
Diagnosis result a51 in fig. 3 is an example of event a5 relating to diagnosis results. In the table of fig. 3, only 1 column (diagnosis result a51) is shown as the column corresponding to event a5, but the number of columns may be increased or decreased as appropriate depending on the number of events a5.
In addition, medication a61 in fig. 3 is an example of event a6 relating to medication. In the table of fig. 3, only 1 column (medication a61) is shown as the column corresponding to event a6, but the number of columns may be increased or decreased as appropriate depending on the number of events a6.
Note that event a21, event a22, and event a23 in fig. 3 are examples of event a2, which corresponds to various events other than the above-described events a1, A3, a4, a5, and a6. For example, event a2 is an event relating to surgery and treatment. In the table of fig. 3, 3 columns (event a21, event a22, and event a23) are shown as the columns corresponding to event a2, but the number of these columns may be increased or decreased as appropriate depending on the number of events a2.
According to the table of fig. 3, the user can thus be presented with a time series of past events for the visiting patient. However, having to survey all events over, for example, the past several years increases the burden on the user. In fig. 3, records for a total of 14 visit days from "9/1/2012" to "10/2018" are displayed, and it is not easy to confirm all 14 records while diagnosing the patient who visited hospital C on 12/25/2018.
To reduce the number of events to be confirmed, it is conceivable to narrow down the past events required for diagnosis by, for example, having the user interview the patient. For example, fig. 3 shows that a patient with a symptom a11 classified as "chest pain" visited the hospital on the current day, "12/25/2018". It is therefore possible to narrow the events required for the current diagnosis down to those of past days on which chest pain appeared as a symptom.
In practice, however, even events from a day on which chest pain appeared as a symptom may be unnecessary for the current diagnosis. That is, even if chest pain appeared as a symptom in the past, the current disease may be related to the past disease, or it may be a newly developed disease unrelated to it. Conversely, events from a day on which chest pain did not appear as a symptom may still be necessary for the current diagnosis.
The user therefore needs to adjust the content of the interview appropriately, referring back to the table of fig. 3 after listening to the patient's symptoms. For example, the user may further question the patient about possible disease names based on the symptoms heard and on the result of consulting the table of fig. 3, and thereby narrow down the events required to make the current diagnosis. Narrowing down events thus takes multiple steps, requiring considerable work and time and placing a large burden on the user.
Therefore, the medical information processing apparatus 100 makes the patient's condition easy to grasp from past events through the processing performed by the processing circuit 140, described in detail below. Specifically, first, the reception function 141 receives a selection of a target part or a target disease. Next, the extraction function 142 extracts past 1st events corresponding to the selected target part or target disease, and past 2nd events corresponding to a part or disease associated with it. Then, the display control function 143 maps the extracted 1st and 2nd events onto a schema (anatomical diagram) and displays them.
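The three steps just described (accept a selection, extract 1st and 2nd events, display) can be sketched as a small pipeline. All function names, master-data contents, and the text-based display stand-in below are illustrative assumptions; the patent's display maps events onto a graphical schema:

```python
# Hypothetical master data: M11 maps a part to its events,
# M21 maps a part to associated parts (see the description below).
M11 = {
    "head": [("2012-09-21", "malignant tumor (2 cm) resection")],
    "leg": [("2012-09-15", "leg edema")],
}
M21 = {"head": ["leg", "digestive organs"]}

def accept_selection(target_part: str) -> str:
    # Reception function 141: accept the target part chosen on the schema.
    return target_part

def extract(target_part: str):
    # Extraction function 142: 1st events for the target part itself,
    # 2nd events for the parts associated with it.
    first = M11.get(target_part, [])
    second = [ev for part in M21.get(target_part, [])
              for ev in M11.get(part, [])]
    return first, second

def display(first, second) -> list[str]:
    # Display control function 143: rendered here as text lines
    # instead of annotations on a graphical schema.
    lines = [f"1st: {d} {e}" for d, e in first]
    lines += [f"2nd: {d} {e}" for d, e in second]
    return lines

first, second = extract(accept_selection("head"))
print(display(first, second))
```

Selecting "head" thus surfaces both the head's own history and the leg edema that accompanied it, which is the point of extracting 2nd events.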
For example, when the patient visits the hospital, the display control function 143 first causes the diagram C1 shown in fig. 4 to be displayed on the display 120. Specifically, the display control function 143 displays a diagram showing the front and back of a standard human body as the diagram C1, and displays the sites D1, D2, and D3 on it. Fig. 4 is a diagram showing a display example according to embodiment 1. In fig. 4, a case where the reception function 141 receives a selection of a target site will be described as an example.
Here, the sites displayed in the diagram C1 may be preset sites or sites set according to the visiting patient. For example, the display control function 143 refers to the past events stored in the data storage device 200 for the visiting patient, and acquires sites that had symptoms in the past, sites diagnosed as having a disease, and the like. The display control function 143 then displays the acquired sites in the diagram C1. Fig. 4 shows, as an example, a case where a site D1 as the "head", a site D2 as the "legs", and a site D3 as the "lungs" are displayed.
Next, the reception function 141 receives a selection of a target site from the user via the diagram C1. For example, the user selects the suspected site or condition based on the patient's reason for the visit. Examples of such reasons include "xx hurts", "a rash appeared", "no improvement after taking medicine", "an abnormality in xx was found in a health checkup", "post-operative follow-up of xx was requested", "care was requested", and the like.
For example, when the patient's reason for the visit is "headache", the user inputs an operation of selecting the site D1 in the diagram C1 via the input interface 110, for example by a click operation with a mouse or a tap operation on a touch screen. The reception function 141 thereby receives the selection of the site D1 as the target site.
Next, the extraction function 142 extracts past 1st events corresponding to the selected target site. For example, when the site D1 as the "head" is selected, the extraction function 142 extracts events corresponding to the "head" from the past events stored in the data storage apparatus 200. Here, events corresponding to the "head" include, for example, examinations, surgery, and treatment on the head, medication for head diseases, case histories of head diseases, diagnosis results of head diseases, and the like.
For example, the extraction function 142 creates in advance master data M11 that defines the correspondence between sites and events, and stores the master data M11 in the memory 130. The master data M11 can be created based on input operations by a user such as a doctor. For example, the extraction function 142 receives input operations from the user via the input interface 110 and generates the master data M11 by associating events such as a CT (Computed Tomography) examination of the heart, a SPECT (Single Photon Emission Computed Tomography) examination of the myocardium, an ultrasound examination of the heart, and blood pressure with the site "circulatory organs".
Alternatively, the master data M11 may be created automatically by the extraction function 142. For example, the extraction function 142 first associates with the site "head" the various diseases known to present symptoms in the head. Next, the extraction function 142 creates the master data M11 by further associating with the site "head" events such as examinations, surgery, treatment, and medication performed in the past for patients having a disease associated with that site, as well as the case histories of such patients. For example, the extraction function 142 creates the master data M11 by associating with the site "head" the events that users referred to when a patient having a disease associated with the "head" visited the hospital. Alternatively, the extraction function 142 may acquire master data M11 created by another device via the network NW, for example.
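One way to read the automatic creation just described: associate diseases with a part, then fold in the events recorded for patients who had those diseases. A toy sketch with invented data (neither the mappings nor the event names come from the patent):

```python
# Diseases known to present symptoms for each part (invented examples).
part_to_diseases = {"head": ["head tumor", "migraine"]}

# Events recorded in the past for patients having each disease (invented).
disease_to_events = {
    "head tumor": ["brain examination", "tumor resection"],
    "migraine": ["pain relief medication"],
}

# Master data M11: each part maps to all events reachable via its diseases.
M11 = {
    part: sorted({ev for d in diseases for ev in disease_to_events.get(d, [])})
    for part, diseases in part_to_diseases.items()
}
print(M11["head"])
```

The set comprehension deduplicates events shared by multiple diseases before the list is sorted for stable output.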
When the reception function 141 receives the selection of the "head", the extraction function 142 reads the master data M11 from the memory 130 and extracts the events corresponding to the "head" in the master data M11 as 1st events. For example, the extraction function 142 extracts a surgical record such as "removal of a malignant tumor (2 cm)" together with its date and time information as a 1st event corresponding to the "head". The extraction function 142 also extracts examination results such as "no tumor after surgery" and "no abnormality in brain examination" together with their date and time information as events corresponding to the "head". The display control function 143 then maps the various extracted 1st events onto the diagram C1 and displays them.
For example, as shown in the lead-out box E11 of fig. 5, the display control function 143 displays 1st events such as "head tumor", "2012/9/21: malignant tumor (2 cm) resection", "2012/09/30: post-surgery, no tumor", "2014/09/18: no abnormality in brain examination", and "2 months in 6 years: no abnormality" in association with the site D1 in the diagram C1. In addition, as shown in the lead-out box E12 of fig. 5, the display control function 143 displays "headache (during head tumor therapy) 2012/8/01 to 2012/9/25" in association with the site D1 in the diagram C1. Fig. 5 is a diagram showing a display example according to embodiment 1.
Further, the extraction function 142 extracts past 2nd events corresponding to sites associated with the selected target site. For example, when the site D1 as the "head" is selected, the extraction function 142 first specifies the sites associated with the "head". Here, sites associated with the "head" are, for example, sites in which symptoms occur due to head diseases, sites whose diseases cause symptoms in the head, and the like.
For example, the extraction function 142 creates in advance master data M21 in which correspondence relationships between sites are set, and stores the master data M21 in the memory 130. The master data M21 may be created based on an input operation by a user such as a doctor. Alternatively, the master data M21 may be created automatically by the extraction function 142, or the extraction function 142 may acquire, via the network NW, master data M21 created in another device.
For example, there are cases where a patient having a "head tumor" as a head disease develops, in addition to "headache" as a head symptom, "vomiting" as a symptom of the digestive organs and "leg edema" as a symptom of the legs. Therefore, the extraction function 142 creates the master data M21 by associating sites such as "digestive organs" and "legs" with the site "head". For example, when the reception function 141 receives the selection of "head", the extraction function 142 reads the master data M21 from the memory 130 and specifies the sites corresponding to "head" in the master data M21 as the sites associated with "head".
For example, the extraction function 142 specifies "legs" as a site associated with "head". Next, the extraction function 142 reads the master data M11 from the memory 130 and extracts the events corresponding to "legs" from the master data M11 as 2nd events. For example, the extraction function 142 extracts the symptom "leg edema", together with its date and time, as a 2nd event corresponding to "legs".
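Under the same assumptions as before, the two-step lookup — M21 to find the associated sites, then M11 to pull their events as 2nd events — might look like the following sketch; the data and names are hypothetical.

```python
# Hypothetical master data M21: associations between sites.
MASTER_M21 = {"head": ["digestive organs", "legs"]}

# Hypothetical master data M11: events per site as (date string, description).
MASTER_M11 = {
    "head": [("2012/9/21", "malignant tumor (2 cm) resection")],
    "legs": [("2012/9/15", "leg edema (during head tumor treatment)")],
}

def extract_second_events(target_site, m21=MASTER_M21, m11=MASTER_M11):
    """For the selected target site, look up the associated sites in M21,
    then collect their past events from M11 as the 2nd events."""
    related_sites = m21.get(target_site, [])
    return {site: m11[site] for site in related_sites if site in m11}

second_events = extract_second_events("head")
# "legs" has a recorded event; "digestive organs" has none in this sketch
```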
Then, the display control function 143 maps the extracted 2nd events onto the diagram C1 and displays them. For example, as shown in the lead-out frame E13 of fig. 5, the display control function 143 displays "leg edema (during head tumor treatment) 2012/9/15 to 2012/9/25" as a 2nd event in association with the site D2, the "legs".
In fig. 5, the description assumes that the lead-out frames E11, E12, and E13 are displayed when the selection of the target site is received. However, the embodiment is not limited thereto. For example, the display control function 143 may display the lead-out frames E11, E12, and E13 when a predetermined operation for displaying past events of the patient is received after the reception function 141 receives the selection of the target site. For example, the display control function 143 may display the lead-out frames E11, E12, and E13 when an operation such as a click on the diagram C1 is performed after the reception function 141 receives the selection of the target site.
Note that, although only text is shown in the lead-out frames E11, E12, and E13 in fig. 5, the display control function 143 may display an image, a graph, or the like instead of, or in addition to, the text. For example, when a plurality of medical images of the patient's head have been collected, the display control function 143 may display thumbnails of the plurality of medical images in the lead-out frame E11 or the lead-out frame E12.
The display control function 143 may also display the 1st events and the 2nd events in different forms. For example, the display control function 143 may vary the size, weight, color, and the like of the text between the 1st events displayed in the lead-out frames E11 and E12 and the 2nd events displayed in the lead-out frame E13. For example, the display control function 143 may vary the shape, size, or color among the lead-out frames E11, E12, and E13.
In fig. 5, the description assumes that the 1st events are displayed in the lead-out frames E11 and E12 and the 2nd event is displayed in the lead-out frame E13. That is, in fig. 5, the 1st events and the 2nd events are displayed in different areas. However, the embodiment is not limited thereto. That is, the display control function 143 may display the 1st events and the 2nd events in the same area.
For example, as shown in fig. 6, the display control function 143 displays in the lead-out frame E21 the 1st events "head tumor", "2012/9/21: malignant tumor (2 cm) resection", "2012/09/30: post-surgery, no tumor", "2014/09/18: no abnormality in brain examination", "year 6, month 2: no abnormality", and "headache (during head tumor treatment) 2012/8/01 to 2012/9/25", together with "leg edema (during head tumor treatment) 2012/9/15 to 2012/9/25" as a 2nd event. Fig. 6 is a diagram showing a display example according to embodiment 1.
In fig. 5 and fig. 6, the 2nd event is displayed automatically together with the 1st events. However, the embodiment is not limited thereto. For example, the display control function 143 may display the 2nd event when the user requests it after the 1st events are displayed.
For example, the display control function 143 first displays in the lead-out frame E31 "head tumor", "2012/9/21: malignant tumor (2 cm) resection", "2012/09/30: post-surgery, no tumor", "2014/09/18: no abnormality in brain examination", "year 6, month 2: no abnormality", and "headache (during head tumor treatment) 2012/8/01 to 2012/9/25". Further, the display control function 143 displays an icon E32 indicating "associated events". In fig. 7A, the icon E32 is shown inside the lead-out frame E31, but the position where the icon E32 is displayed is arbitrary. Fig. 7A is a diagram showing a display example according to embodiment 1.
Here, when the user performs an operation on the icon E32, the display control function 143 displays the lead-out frame E33 as shown in fig. 7B, and displays "leg edema (during head tumor treatment) 2012/9/15 to 2012/9/25" as the 2nd event in the lead-out frame E33. The operation on the icon E32 is an operation of selecting the icon E32 by, for example, a click operation with a mouse or a tap operation on a touch panel. Fig. 7B is a diagram showing a display example according to embodiment 1.
In fig. 7B, the description assumes that the 2nd event is displayed in the lead-out frame E33 when the operation on the icon E32 is performed, but the position where the 2nd event is displayed is arbitrary. For example, when the operation on the icon E32 is performed, the display control function 143 may display the 2nd event in the lead-out frame E31 without displaying the lead-out frame E33.
Next, an example of the procedure of processing performed by the medical information processing apparatus 100 will be described with reference to fig. 8. Fig. 8 is a flowchart explaining the series of processing performed by the medical information processing apparatus 100 according to embodiment 1. Step S102 corresponds to the reception function 141. Steps S103 and S104 correspond to the extraction function 142. Steps S101 and S105 correspond to the display control function 143.
First, the processing circuit 140 displays the diagram C1 shown in fig. 4 on the display 120 (step S101). Next, the processing circuit 140 receives the selection of a target site from the doctor (step S102). For example, the processing circuit 140 receives the selection of the target site by receiving, from the doctor referring to the diagram C1, an operation of selecting any one of the sites D1, D2, and D3 in the diagram C1.
Next, the processing circuit 140 extracts past 1st events corresponding to the selected target site (step S103). For example, the processing circuit 140 extracts the 1st events using the master data M11 read from the memory 130.
Next, the processing circuit 140 extracts past 2nd events corresponding to the sites associated with the selected target site (step S104). For example, the processing circuit 140 first specifies the sites associated with the selected target site using the master data M21 read from the memory 130. The processing circuit 140 then extracts the 2nd events corresponding to the specified sites using the master data M11. Then, the processing circuit 140 maps the extracted 1st events and 2nd events onto the diagram C1 and displays them on the display 120 (step S105), and the processing ends.
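The flow of steps S101 to S105 can be summarized in a short sketch; the helper names and the stubbed display call below are illustrative assumptions, not the apparatus's actual interface.

```python
def run_processing_flow(selected_site, m11, m21, display):
    """Sketch of the fig. 8 flow: S101 display the diagram, S102 accept a
    site, S103 extract 1st events, S104 extract 2nd events, S105 display."""
    display("diagram C1")                                  # S101
    target = selected_site                                 # S102
    first_events = m11.get(target, [])                     # S103
    second_events = []                                     # S104
    for related_site in m21.get(target, []):
        second_events.extend(m11.get(related_site, []))
    display(("mapped onto diagram C1", first_events, second_events))  # S105
    return first_events, second_events

# Example run with toy master data and a display stub that records calls.
shown = []
first, second = run_processing_flow(
    "head",
    m11={"head": ["tumor resection 2012/9/21"], "legs": ["leg edema"]},
    m21={"head": ["legs"]},
    display=shown.append,
)
```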
In the above, the description assumed that the events corresponding to a site are extracted using the master data M11. However, the embodiment is not limited thereto. For example, the extraction function 142 may extract the events corresponding to a site by a machine learning method.
For example, the extraction function 142 first acquires, from the data storage device 200, a reference history indicating the events referred to by doctors for each site. Then, the extraction function 142 inputs the sites in the reference history as input-side data and the events as output-side data to a machine learning engine. Thus, the machine learning engine generates a learned model N11 that receives an input of a site and outputs the corresponding events. For example, the machine learning engine generates the learned model N11 using various algorithms such as deep learning, neural networks, logistic regression analysis, nonlinear discriminant analysis, support vector machines (SVM), random forests, and naive Bayes. Further, the extraction function 142 stores the generated learned model N11 in the memory 130. Alternatively, the extraction function 142 may acquire, via the network NW, a learned model N11 created in another device, and store the acquired model in the memory 130.
When the reception function 141 receives the selection of a target site, the extraction function 142 inputs the selected target site to the learned model N11 read from the memory 130. The learned model N11 receives the input of the target site and outputs the corresponding events. Thereby, the extraction function 142 extracts the past 1st events corresponding to the selected target site.
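As a rough illustration of the idea behind the learned model N11 — not the algorithms the patent names, which include deep learning, SVMs, and so on — a minimal frequency-based stand-in can be trained on (site, event) pairs from the reference history:

```python
from collections import Counter, defaultdict

class ReferenceHistoryModel:
    """Toy stand-in for learned model N11: learns which events doctors
    referred to for each site from (site, event) reference-history pairs."""
    def __init__(self):
        self.counts = defaultdict(Counter)

    def fit(self, history):
        for site, event in history:
            self.counts[site][event] += 1
        return self

    def predict(self, site, top_k=3):
        # Return up to top_k events most frequently referred to for the site.
        return [event for event, _ in self.counts[site].most_common(top_k)]

history = [
    ("head", "brain examination result"),
    ("head", "tumor resection record"),
    ("head", "brain examination result"),
    ("leg", "edema record"),
]
model = ReferenceHistoryModel().fit(history)
```

In this sketch, frequency of reference stands in for the learned mapping; a real implementation would train one of the listed algorithms on the same input/output pairing.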
In the above, the description assumed that the sites associated with the target site whose selection was received by the reception function 141 are specified using the master data M21. However, the embodiment is not limited thereto. For example, the extraction function 142 may specify the associated sites by a machine learning method.
For example, the extraction function 142 first acquires, from the data storage device 200, site information indicating a plurality of diseased sites of the same patient. The extraction function 142 executes machine learning on the assumption that a plurality of diseased sites of the same patient are associated with each other. For example, the extraction function 142 inputs any one of the plurality of diseased sites of the same patient as input-side data and the other sites as output-side data to the machine learning engine. Thus, the machine learning engine generates a learned model N21 that receives an input of a site and outputs the associated sites. For example, the machine learning engine generates the learned model N21 using various algorithms such as deep learning, neural networks, logistic regression analysis, nonlinear discriminant analysis, support vector machines, random forests, and naive Bayes. Further, the extraction function 142 stores the generated learned model N21 in the memory 130. Alternatively, the extraction function 142 may acquire, via the network NW, a learned model N21 created in another device, and store the acquired model in the memory 130.
When the reception function 141 receives the selection of a target site, the extraction function 142 inputs the selected target site to the learned model N21 read from the memory 130. The learned model N21 receives the input of the target site and outputs the associated sites. Thereby, the extraction function 142 specifies the sites associated with the selected target site. Further, the extraction function 142 inputs each specified site to the learned model N11 read from the memory 130. The learned model N11 receives the input of the site and outputs the corresponding events. Thereby, the extraction function 142 extracts the past 2nd events corresponding to the sites associated with the selected target site. Here, the extraction function 142 may instead extract the 2nd events using the master data M11.
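Treating the two learned models as black-box callables, the chained inference described above — N21 proposes the associated sites and N11 supplies each site's events — can be sketched as follows; the dictionary-backed stand-ins are hypothetical.

```python
def extract_second_events_via_models(target_site, model_n21, model_n11):
    """Chain N21 (site -> associated sites) with N11 (site -> events)
    to obtain the 2nd events for the selected target site."""
    second_events = []
    for site in model_n21(target_site):
        second_events.extend(model_n11(site))
    return second_events

# Stand-ins for the trained models, here backed by fixed dictionaries.
model_n21 = lambda site: {"head": ["legs"]}.get(site, [])
model_n11 = lambda site: {"legs": ["leg edema 2012/9/15 to 2012/9/25"]}.get(site, [])

result = extract_second_events_via_models("head", model_n21, model_n11)
```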
In addition, although the description above assumed that the selection of the target site is received by receiving an operation on the diagram C1, the embodiment is not limited thereto. For example, the reception function 141 may receive the selection of the target site by character input. As an example, the doctor enters the characters "head" using a keyboard or a touch panel. Thus, the reception function 141 receives the selection of the site D1, the "head", as the target site.
In addition, although the description above assumed that the selection of a target site is received, the embodiment is not limited thereto. For example, the reception function 141 may receive the selection of a target disease instead of a target site. For example, the doctor enters the name of a disease via the input interface 110 based on the patient's reason for the hospital visit. For example, the doctor enters the characters "head tumor" as the target disease using a keyboard or a touch panel.
When the reception function 141 receives the selection of a target disease, the extraction function 142 extracts past 1st events corresponding to the selected target disease. For example, the extraction function 142 creates in advance master data M12 in which correspondence relationships between diseases and events are set, and stores the master data M12 in the memory 130. The master data M12 may be created based on an input operation by a user such as a doctor. Alternatively, the master data M12 may be created automatically by the extraction function 142, or the extraction function 142 may acquire, via the network NW, master data M12 created in another device. For example, when the reception function 141 receives the selection of "head tumor", the extraction function 142 reads the master data M12 from the memory 130 and extracts the events corresponding to "head tumor" from the master data M12 as the 1st events.
Alternatively, for example, the extraction function 142 creates in advance a learned model N12 that receives an input of a disease and outputs the corresponding events, and stores the model in the memory 130. Alternatively, the extraction function 142 may acquire a learned model N12 created in another device and store the acquired model in the memory 130. For example, when the reception function 141 receives the selection of "head tumor", the extraction function 142 inputs the selected target disease to the learned model N12 read from the memory 130. The learned model N12 receives the input of the target disease and outputs the corresponding events. Thereby, the extraction function 142 extracts the past 1st events corresponding to the selected target disease.
Then, the display control function 143 maps the various extracted 1st events onto the diagram C1 and displays them. For example, the display control function 143 displays the 1st events extracted as corresponding to "head tumor" in association with the site D1, the "head", on the diagram C1.
When the reception function 141 receives the selection of a target disease, the extraction function 142 also specifies the sites or diseases associated with the selected target disease. For example, the extraction function 142 first creates in advance master data M22 in which correspondence relationships between diseases are set and master data M23 in which correspondence relationships between sites and diseases are set, and stores them in the memory 130. The master data M22 and the master data M23 may be created based on an input operation by a user such as a doctor. Alternatively, the master data M22 and the master data M23 may be created automatically by the extraction function 142, or the extraction function 142 may acquire, via the network NW, master data M22 and master data M23 created in another device.
Here, for example, when the reception function 141 receives the selection of "head tumor", the extraction function 142 reads the master data M22 from the memory 130 and specifies the diseases associated with "head tumor" in the master data M22. The extraction function 142 also reads the master data M23 from the memory 130 and specifies the sites corresponding to "head tumor" in the master data M23 as the sites associated with "head tumor".
Alternatively, the extraction function 142 creates in advance a learned model N22 that receives an input of a disease and outputs the associated diseases, and a learned model N23 that receives an input of a disease and outputs the associated sites, and stores them in the memory 130. Alternatively, the extraction function 142 may acquire a learned model N22 and a learned model N23 created in another device and store them in the memory 130.
Here, for example, when the reception function 141 receives the selection of "head tumor", the extraction function 142 inputs the selected target disease to the learned model N22 read from the memory 130. The learned model N22 receives the input of the target disease and outputs the associated diseases. Thereby, the extraction function 142 specifies the diseases associated with the target disease. Further, the extraction function 142 inputs the selected target disease to the learned model N23 read from the memory 130. The learned model N23 receives the input of the target disease and outputs the associated sites. Thereby, the extraction function 142 specifies the sites associated with the target disease.
Next, the extraction function 142 extracts the 2nd events corresponding to the specified sites or diseases. For example, the extraction function 142 extracts the 2nd events corresponding to the specified sites or diseases using the master data M11, in which the correspondence relationships between sites and events are set, and the master data M12, in which the correspondence relationships between diseases and events are set. Alternatively, the extraction function 142 extracts the 2nd events corresponding to the specified sites or diseases using the learned model N11, which receives an input of a site and outputs the corresponding events, and the learned model N12, which receives an input of a disease and outputs the corresponding events.
In addition, the description above assumed that, when the reception function 141 receives the selection of a target site, the extraction function 142 extracts past 2nd events corresponding to the sites associated with the selected target site. However, the extraction function 142 may also extract past events corresponding to the diseases associated with the selected target site as 2nd events.
For example, when the reception function 141 receives the selection of a target site, the extraction function 142 extracts the past 1st events corresponding to the selected target site using the master data M11, in which the correspondence relationships between sites and events are set, or the learned model N11, which receives an input of a site and outputs the corresponding events. Next, the extraction function 142 specifies the sites or diseases associated with the target site using the master data M21, in which the correspondence relationships between sites are set, and the master data M23, in which the correspondence relationships between sites and diseases are set.
Alternatively, the extraction function 142 creates in advance a learned model N24 that receives an input of a site and outputs the associated diseases, or acquires a learned model N24 created in another device, and stores it in the memory 130. Then, the extraction function 142 specifies the sites or diseases associated with the target site using the learned model N21, which receives an input of a site and outputs the associated sites, and the learned model N24, which receives an input of a site and outputs the associated diseases.
The extraction function 142 then extracts the 2nd events corresponding to the specified sites or diseases. For example, the extraction function 142 extracts the 2nd events corresponding to the specified sites or diseases using the master data M11, in which the correspondence relationships between sites and events are set, and the master data M12, in which the correspondence relationships between diseases and events are set. Alternatively, the extraction function 142 extracts the 2nd events corresponding to the specified sites or diseases using the learned model N11, which receives an input of a site and outputs the corresponding events, and the learned model N12, which receives an input of a disease and outputs the corresponding events.
In addition, the learned model N11 and the learned model N12 described above may be combined. For example, the extraction function 142 generates in advance a learned model N13 that receives an input of a site or a disease and outputs the corresponding events, and stores the model in the memory 130. The extraction function 142 extracts, using the learned model N13, the past 2nd events corresponding to the sites or diseases associated with the target site or target disease whose selection was received by the reception function 141.
In addition, the learned model N21, the learned model N22, the learned model N23, and the learned model N24 described above may be combined. For example, the extraction function 142 generates in advance a learned model N25 that receives an input of a site or a disease and outputs the associated sites or diseases, and stores it in the memory 130. The extraction function 142 specifies, using the learned model N25, the sites or diseases associated with the target site or target disease whose selection was received by the reception function 141.
Further, the learned model N13 and the learned model N25 described above may be combined. For example, the extraction function 142 generates in advance a learned model N31 that receives an input of a site or a disease and outputs both the events corresponding to the input site or disease and the events corresponding to the sites or diseases associated with the input site or disease, and stores the learned model N31 in the memory 130. The extraction function 142 inputs the target site or target disease selected via the reception function 141 to the learned model N31, specifies the sites or diseases associated with the target site or target disease, and extracts the 1st events and the 2nd events.
The extraction function 142 may also extract the 1st events and the 2nd events in consideration of the amount of information the user can refer to. For example, the extraction function 142 may set a threshold for the number of 1st events and 2nd events extracted. That is, depending on the patient, the number of 1st events and 2nd events to be extracted may be large, and displaying all of them on the display 120 may be cluttered. Therefore, the extraction function 142 extracts, as the 1st events and the 2nd events, for example, a predetermined number of higher-priority events from among the past events corresponding to the selected target site or target disease and the past events corresponding to the sites or diseases associated with the selected target site or target disease.
The priority of an event may be set, for example, according to the date and time of the event. For example, the extraction function 142 sets a higher priority for newer events. The priority of an event may also be set according to its content. For example, the extraction function 142 sets a higher priority for "unacceptable pain" than for "moderate pain".
The extraction function 142 may also extract the 1st events and the 2nd events according to the size of the display area for the 1st events and the 2nd events. That is, when the number of extracted 1st events and 2nd events is large, not all of the events may fit on the display 120, or the display size of each event may have to be reduced to fit them all. Conversely, if the display area for the 1st events and the 2nd events is large, the events may be displayed at a sufficient size even when their number is large. Therefore, the extraction function 142 extracts higher-priority events, within the range displayable in the display area of the display 120, from among the past events corresponding to the selected target site or target disease and the past events corresponding to the sites or diseases associated with the selected target site or target disease.
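One way to realize the priority threshold and the display-area constraint described above is to rank events by a priority key (severity of content, then recency) and cut the list to what fits; the severity table and row-height arithmetic below are illustrative assumptions, not details from the patent.

```python
from datetime import date

# Hypothetical content-based severity scores; unknown contents score 0.
SEVERITY = {"unacceptable pain": 2, "moderate pain": 1}

def select_events(events, area_height_px, row_height_px=20):
    """Rank (date, content) events by severity then recency, and keep only
    as many as fit in the given display area."""
    capacity = max(area_height_px // row_height_px, 0)
    ranked = sorted(
        events,
        key=lambda ev: (SEVERITY.get(ev[1], 0), ev[0]),  # (severity, date)
        reverse=True,
    )
    return ranked[:capacity]

events = [
    (date(2012, 8, 1), "moderate pain"),
    (date(2014, 9, 18), "no abnormality in brain examination"),
    (date(2012, 9, 1), "unacceptable pain"),
]
shown = select_events(events, area_height_px=40)  # room for two rows
```

Deriving the capacity from the display area rather than a fixed threshold lets the same ranking serve both variants described in the text.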
As described above, according to embodiment 1, the reception function 141 receives the selection of a target site or a target disease. Further, the extraction function 142 extracts past 1st events corresponding to the selected target site or target disease and past 2nd events corresponding to the sites or diseases associated with the selected target site or target disease. The display control function 143 maps the extracted 1st events and 2nd events onto the diagram and displays them. Therefore, the medical information processing apparatus 100 according to embodiment 1 can facilitate grasping the patient's state based on past events. Further, the medical information processing apparatus 100 can make efficient the series of processes of considering the suspected disease name and studying the necessary examinations and treatments.
For example, when the patient's reason for the visit is "headache", past events concerning sites other than the head are, in many cases, unnecessary for the current diagnosis. On the other hand, not all past events concerning sites other than the head are unnecessary, and some of them may be useful for the current diagnosis. In this regard, by extracting and displaying the 1st events and the 2nd events, the medical information processing apparatus 100 prevents omissions in extraction while reducing the amount of event information presented to the doctor. Further, since the medical information processing apparatus 100 maps the 1st events and the 2nd events onto the diagram and displays them, the doctor can intuitively understand the past events, which facilitates grasping the patient's state.
Further, the medical information processing apparatus 100 enables diagnosis based on the patient's past events.
That is, compared with making a diagnosis based only on the symptoms of the patient on the day of the visit and the results of examinations performed that day, the medical information processing apparatus 100 can present more information to the doctor and enable a more accurate diagnosis. In addition, when a diagnosis is made based only on the symptoms or examination results of that day, the information used for the diagnosis may be insufficient and the patient may be placed under follow-up observation. In contrast, by additionally presenting the patient's past events, the medical information processing apparatus 100 enables diagnosis on the same day, avoids unnecessary follow-up observation, and allows the required treatment or operation to be performed more quickly.
In embodiment 1 described above, the sites (sites D1 to D3) in the diagram C1 are displayed in the same manner. In contrast, in embodiment 2, a case will be described in which the color of a site in the diagram C1 is changed according to a date designated by the user.
The medical information processing system 1 according to embodiment 2 has the same configuration as the medical information processing system 1 shown in fig. 1, and differs in part of the processing performed by the reception function 141 and the display control function 143. Components having the same configurations as those described in embodiment 1 are given the same reference numerals as in fig. 1, and their description is omitted.
For example, when the patient comes to the hospital, the display control function 143 first displays the screen shown in fig. 9A on the display 120. Specifically, the display control function 143 collects events concerning the visiting patient from the data storage device 200 and displays the events along a time axis in the region R1 of fig. 9A. That is, the display control function 143 creates a table by arranging the collected past events in time series, as in fig. 3, and displays the table in the region R1 of fig. 9A. Fig. 9A is a diagram showing a display example according to embodiment 2.
Further, the display control function 143 displays the diagram C1 in the region R2 of fig. 9A. For example, as in fig. 4, the display control function 143 displays a diagram showing the front and back of a normal human body as the diagram C1, and displays the sites D1, D2, and D3 on the diagram.
Further, the display control function 143 causes the ultrasonic image I1 collected from the patient in the past to be displayed in the region R3 of fig. 9A. The region R3 may be omitted.
In addition, an image other than the ultrasonic image I1 may be displayed in the region R3. For example, in accordance with an input operation by the doctor, the display control function 143 displays in the region R3 medical images collected from the visiting patient in the past, such as X-ray images, CT images, PET (Positron Emission Tomography) images, SPECT images, and MRI (Magnetic Resonance Imaging) images. The display control function 143 may also display in the region R3 an image representing an analysis result based on a past medical image of the patient. For example, the display control function 143 displays in the region R3 a perfusion image representing the result of a perfusion analysis that calculates the myocardial blood flow based on a myocardial contrast CT image of the patient.
Further, information other than images may be displayed in the region R3. For example, the display control function 143 displays in the region R3 a chart in which the administration records (the type of administered drug, the dose, and the like) are associated with a time axis. The display control function 143 also displays in the region R3, for example, records (medical records, nursing records, and the like) created in the past for the patient by a doctor, a nurse, or the like.
For example, the display control function 143 displays in the region R3 a graph in which vital data measured in the past are associated with a time axis. Examples of the vital data include the pulse rate, heart rate, respiration rate, blood pressure, body temperature, and percutaneous arterial oxygen saturation (SpO2) measured in the past. Here, the vital data may be wearable-device data measured by a wearable sphygmomanometer or the like. The type of graph may be changed as appropriate. For example, the display control function 143 displays a bar graph, a line graph, or the like in the region R3 as the graph in which the vital data are associated with the time axis. The display control function 143 may also display another chart, such as a histogram, instead of the graph in which the vital data are associated with the time axis.
Although only one region R3 is shown in fig. 9A, the display control function 143 may display a plurality of regions R3. For example, the display control function 143 displays on the display 120, as the plurality of regions R3, a window showing a chart of the administration records, a window showing the medical records, a window showing the nursing records, a window showing a graph of the pulse rate, a window showing a graph of the blood pressure, and a window showing the ultrasonic image I1.
Here, the user referring to the screen of fig. 9A can select a target part or a target disease. For example, when the reception function 141 receives the selection of the part D1, that is, the "head", as the target part, the extraction function 142 extracts the past 1st event corresponding to the "head" and the past 2nd event corresponding to a part or a disease associated with the "head". The display control function 143 maps the 1st event and the 2nd event onto the diagram C1 and displays them, as indicated by the lead-out frames E11, E12, and E13 in fig. 9B. Fig. 9B is a diagram showing a display example according to embodiment 2.
Here, the user referring to the screen of fig. 9A can also specify a date. When a date designation is accepted from the user, the display control function 143 changes the display mode so that the patient information on the designated date can be easily grasped.
For example, the user specifies a date by performing an operation on the table in the region R1. For example, the user designates the date "2012/9/30" in the region R1 of fig. 9C by a click operation using a mouse, a tap operation on a touch panel, or the like. The reception function 141 receives the date designation by the user. Then, the display control function 143 changes the colors of the parts displayed on the diagram C1 in accordance with the designated date, as shown in the region R2 in fig. 9C. Fig. 9C is a diagram showing a display example according to embodiment 2.
For example, the display control function 143 changes the color of each part displayed on the diagram C1 according to the presence or absence of a disease in that part and its treatment state as of the designated date. For example, as of "2012/9/30", the operation for removing the malignant tumor of the head has been completed, and symptoms such as the headache and the leg edema have also been eliminated or alleviated. Therefore, the display control function 143 displays each part on the diagram C1 in a color indicating that there is no disease or a color indicating that treatment has been completed.
Here, the reception function 141 can further receive another date designation by the user. Then, the display control function 143 updates the colors of the parts displayed on the diagram C1 in accordance with the newly designated date. For example, when "2012/9/20" is further designated, the operation for removing the head malignant tumor had not yet been performed at that time, and the patient had symptoms such as headache and leg edema. Therefore, the display control function 143 displays, for example, the part D1, that is, the "head", and the part D3, that is, the "leg", in a color indicating that treatment is not completed, and displays the other parts in a color indicating that there is no disease or a color indicating that treatment has been completed.
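The date-dependent determination above can be sketched as a lookup over disease intervals: a part whose symptom or treatment interval covers the designated date is shown as "treatment not completed", otherwise as "no disease / treated". The `(part, start, end)` event schema and the state labels are illustrative assumptions, not the apparatus's actual data model.

```python
from datetime import date

def part_state_on(events, part, when):
    """Return the display state of one body part as of a designated date.
    `events` is a hypothetical list of (part, start, end) tuples giving
    intervals during which a disease is present and not yet treated."""
    for p, start, end in events:
        if p == part and start <= when <= end:
            return "treatment not completed"
    return "no disease / treated"

# Intervals taken from the example in the text: headache 2012/8/01-9/25,
# leg edema 2012/9/15-9/25.
events = [
    ("head", date(2012, 8, 1), date(2012, 9, 25)),
    ("leg", date(2012, 9, 15), date(2012, 9, 25)),
]
print(part_state_on(events, "head", date(2012, 9, 20)))  # during symptoms
print(part_state_on(events, "head", date(2012, 9, 30)))  # after resolution
```

Each state would then be mapped to a color (e.g. red / blue) when the diagram is repainted.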
For example, the display control function 143 displays parts without a disease and parts already treated in blue, and displays parts not yet treated in red. Here, the display control function 143 may display each part in the two colors blue and red, or may display each part in a gradation from blue to red. For example, the display control function 143 displays a part whose treatment is incomplete in a color mixture in which the ratio of blue to red is adjusted according to the degree of progress of the treatment. Blue and red are merely examples, and the choice of colors may be changed as appropriate.
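The blue-to-red mixture adjusted by treatment progress can be sketched as a linear blend between two endpoint colors. The linear interpolation, the RGB endpoints, and the 0-to-1 progress scale are assumptions for illustration only.

```python
def progress_color(progress):
    """Blend from red (treatment not started, progress 0.0) to blue
    (treatment complete, progress 1.0), returned as an (R, G, B) tuple."""
    progress = min(max(progress, 0.0), 1.0)  # clamp progress to [0, 1]
    red, blue = (255, 0, 0), (0, 0, 255)
    # Channel-wise linear interpolation between the two endpoint colors.
    return tuple(round(r + (b - r) * progress) for r, b in zip(red, blue))

print(progress_color(0.0))  # (255, 0, 0) - pure red
print(progress_color(1.0))  # (0, 0, 255) - pure blue
```

A half-finished treatment yields an even mixture of the two; swapping the endpoint tuples changes the color scheme as the text allows.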
In addition, although the case where the color of a part is changed according to the treatment state has been described, the embodiment is not limited to this. For example, the display control function 143 may change the color of each part displayed on the diagram C1 according to the presence or absence and the severity of a disease in that part as of the designated date. For example, when "2012/9/20" is designated, the operation for removing the head malignant tumor had not yet been performed at that time, and the patient had symptoms such as headache and leg edema. Here, although the edema is a symptom appearing in the legs, it accompanies the head tumor, and the severity of the "legs" is lower than that of the "head". Therefore, the display control function 143 displays, for example, the part D1, that is, the "head", in a color indicating high severity, and displays the part D3, that is, the "leg", and the other parts in a color indicating no disease or a color indicating low severity. For example, the display control function 143 displays the part D1 in red, and displays the part D3 and the other parts in blue.
In addition, the display control function 143 can change the colors of the parts by various methods. For example, the display control function 143 may change the color of a part displayed on the diagram C1 according to the elapsed time since cure as of the designated date. The display control function 143 may also change the color of a part in consideration of a plurality of pieces of information, such as the severity and treatment state of each part as of the designated date and the elapsed time since cure.
The description so far has assumed that the reception function 141 receives a selection of a target part or a target disease and further receives a date designation, but the embodiment is not limited to this. That is, the reception function 141 can receive a date designation even when no selection of a target part or a target disease has been received.
For example, when the patient visits the hospital, the display control function 143 may map the various events of the patient's visit onto the diagram C1 and display them without the reception function 141 accepting a selection of a target part or a target disease from the user.
For example, as shown in fig. 10, the display control function 143 associates various events such as "head tumor", "2012/9/21: malignant tumor (2 cm) resection", "2012/09/30: post-surgery, tumor-free", "2014/09/18: no abnormality in brain examination", and "6 years 2 months: no abnormality" with the part D1, that is, the "head", and displays them in the lead-out frame E11. The display control function 143 associates "headache (during head tumor treatment) 2012/8/01 to 2012/9/25" with the part D1, that is, the "head", and displays it in the lead-out frame E12. The display control function 143 associates "leg edema (during head tumor treatment) 2012/09/15 to 2012/09/25" with the part D2, that is, the "leg", and displays it in the lead-out frame E13. In addition, the display control function 143 associates various events such as "emphysema", "2017/10/05 to 2017/10/25: 20 days of treatment", "2017/10/31: post-surgery, tumor-free", and "1 year: no abnormality" with the part D3, that is, the "lung", and displays them in the lead-out frame E14. The display control function 143 also associates various events such as "stomach cancer", "2018/10/5: malignant tumor (3 cm) resection", and "2018/10/10: post-surgery, tumor-free" with the part D4, that is, the "stomach", and displays them in the lead-out frame E15. Fig. 10 is a diagram showing a display example according to embodiment 2.
When no selection of a target part or a target disease by the user is accepted, the total number of events displayed on the display 120 increases. Therefore, the extraction function 142 may extract events in consideration of the amount of information referred to by the user. For example, the extraction function 142 extracts a prescribed number of higher-priority events from among the past events of the patient who visited the hospital. Alternatively, the extraction function 142 extracts, for example, higher-priority events from among the past events of the patient who visited the hospital, within a range displayable in the display area of the display 120. The display control function 143 maps the events extracted by the extraction function 142 onto the diagram C1 and displays them.
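The priority-limited extraction described above is, in essence, a sort by priority followed by truncation to what the display can hold. The `(priority, description)` tuple schema and the capacity model below are assumptions for illustration.

```python
def extract_events(events, display_capacity):
    """Pick the highest-priority past events, limited to what fits in
    the display area. Higher numeric priority ranks first."""
    ranked = sorted(events, key=lambda e: e[0], reverse=True)
    return [description for _, description in ranked[:display_capacity]]

events = [
    (3, "2012/9/21: malignant tumor resection"),
    (1, "2012/9/1: blood pressure measurement"),
    (2, "2012/9/30: post-surgery, tumor-free"),
]
print(extract_events(events, 2))
```

With a capacity of 2, only the resection and the post-surgery finding survive; the routine measurement is dropped, mirroring the "prescribed number of higher-priority events" behavior.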
The reception function 141 receives a date designation by the user, and the display control function 143 changes the colors of the parts displayed on the diagram C1 in accordance with the designated date. For example, the display control function 143 changes the colors of the parts D1, D2, D3, and D4 displayed on the diagram C1 in accordance with the severity, the treatment state, the elapsed time since cure, and the like as of the designated date.
In fig. 10, the description assumes that past events of the patient are displayed in the lead-out frames E11 to E15, but the display of these frames may be omitted. That is, the display control function 143 need not display the patient's past events on the diagram C1.
Next, an example of the procedure of the processing performed by the medical information processing apparatus 100 will be described with reference to fig. 11. Fig. 11 is a flowchart for explaining a series of flows of processing performed by the medical information processing apparatus 100 according to embodiment 2. Step S202 and step S207 correspond to the reception function 141. Steps S203 and S204 correspond to the extraction function 142. Steps S201, S205, S206, and S208 correspond to the display control function 143.
First, the processing circuit 140 causes the diagram C1 to be displayed on the display 120 (step S201). Next, the processing circuit 140 determines whether a selection of a target part or a target disease has been accepted from the user (step S202). When the selection of a target part or a target disease has been accepted (yes in step S202), the processing circuit 140 extracts the past 1st event corresponding to the selected target part or target disease (step S203). Next, the processing circuit 140 extracts the past 2nd event corresponding to a part or a disease associated with the selected target part or target disease (step S204). Then, the processing circuit 140 maps the extracted 1st event and 2nd event onto the diagram C1 and displays them on the display 120 (step S205).
On the other hand, when no selection of a target part or a target disease has been received (no in step S202), the processing circuit 140 maps the various events of the patient's hospital visit onto the diagram C1 and displays them on the display 120 (step S206). The processing of steps S201, S202, S203, S204, and S205 may be omitted.
After step S205 or step S206, the processing circuit 140 determines whether or not the date designation is accepted from the user (step S207). Here, when the date designation is accepted (yes in step S207), the processing circuit 140 changes the color of the portion displayed on the diagram C1 in accordance with the designated date (step S208), and the process proceeds to step S207 again. When the date designation is not received from the user (no in step S207), the processing circuit 140 ends the processing.
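The branch structure of the fig. 11 flowchart can be sketched as a single function. All argument names and return values below are illustrative assumptions; the real apparatus works on a display, not on a returned list.

```python
def run_display_flow(selection, events_for, all_events, date_spec=None):
    """Sketch of steps S201-S208: show the diagram, branch on whether a
    target part/disease was selected, then recolor on a date designation."""
    shown = []                                 # stands in for the display 120
    shown.append("diagram C1")                 # S201: display the diagram
    if selection is not None:                  # S202: selection accepted
        shown += events_for(selection)         # S203-S205: 1st and 2nd events
    else:                                      # S202: no selection
        shown += all_events                    # S206: all events of the visit
    if date_spec is not None:                  # S207: date designated
        shown.append(f"recolored for {date_spec}")  # S208: update part colors
    return shown

print(run_display_flow(None, None, ["blood pressure"], "2012/9/30"))
```

In the real flow, step S208 loops back to S207 so that the colors are updated each time a new date is designated; the sketch handles only one pass.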
As described above, according to embodiment 2, the reception function 141 receives the date designation. The display control function 143 changes the color of the portion displayed on the graph according to the specified date. Therefore, the medical information processing apparatus 100 relating to embodiment 2 can facilitate grasping of the patient state on the specified date. For example, the user can intuitively grasp the state of the patient on the designated date by referring to the color of each part on the graph.
Further, the medical information processing apparatus 100 updates the color of the part displayed on the graph every time the date is specified. Therefore, the user can intuitively grasp the change in the state of the patient by appropriately changing the specified date while referring to the change in the color of the part displayed on the graph.
For example, the user can intuitively grasp the change in the state of the patient while referring to the change in the color of the part displayed on the graph by appropriately changing the specified date during the execution period of the radiotherapy. For example, the user can intuitively grasp the effect of radiotherapy (disappearance of cancer, degree of reduction, or the like) on a cancer patient, the presence or absence of further metastasis, and the like based on the change in the color of a part. Further, the user can plan the treatment to be performed in the future more easily.
Although embodiments 1 and 2 have been described above, the embodiments may be implemented in various other forms besides those described above.
In the above-described embodiments, the 1st event and the 2nd event are mapped onto one diagram and displayed. However, the embodiment is not limited to this. That is, the display control function 143 may map the 1st event and the 2nd event onto a plurality of diagrams and display them.
For example, as shown in fig. 12, the display control function 143 causes the display 120 to display a table in which the past events of the patient who visited the hospital are arranged in time series, together with a diagram for each date in the table. Specifically, the display control function 143 displays the diagram C2 in association with "2012/9/1", the diagram C3 in association with "2012/9/5", and the diagram C4 in association with "2012/9/10". Fig. 12 is a diagram showing a display example according to embodiment 3.
Next, the reception function 141 receives a selection of a target part or a target disease from the user. For example, the reception function 141 receives a click operation or a tap operation on any one of the diagrams C2, C3, and C4, and thereby receives a selection of a target part. Alternatively, the reception function 141 receives, for example, a character input from the user, and thereby receives a selection of a target part or a target disease. Further, the extraction function 142 extracts the past 1st event corresponding to the selected target part or target disease and the past 2nd event corresponding to a part or a disease associated with the target part or target disease.
Next, the display control function 143 maps the extracted 1 st event and 2 nd event onto a plurality of graphs and displays them according to the date of each event. For example, the display control function 143 displays the event "2012/9/1" of the extracted 1 st event and 2 nd event by mapping it on the graph C2, the event "2012/9/5" by mapping it on the graph C3, and the event "2012/9/10" by mapping it on the graph C4.
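Distributing the extracted events over the per-date diagrams is a grouping of events by date, keeping only dates that have a diagram. The `(date, label)` event tuples below are a hypothetical schema for illustration.

```python
from collections import defaultdict

def map_events_to_diagrams(events, diagram_dates):
    """Group extracted 1st/2nd events onto the per-date diagrams
    (C2, C3, C4, ...) by matching each event's date."""
    per_diagram = defaultdict(list)
    for event_date, label in events:
        if event_date in diagram_dates:       # only dates that have a diagram
            per_diagram[event_date].append(label)
    return dict(per_diagram)

events = [("2012/9/1", "blood pressure"),
          ("2012/9/5", "MRI examination"),
          ("2012/9/5", "blood pressure")]
print(map_events_to_diagrams(events, {"2012/9/1", "2012/9/5", "2012/9/10"}))
```

Here "2012/9/10" ends up with no entries, matching a diagram C4 on which nothing is mapped for that date.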
For example, when the extracted 1st event and 2nd event include a measurement of blood pressure on "2012/9/1", the display control function 143 causes "blood pressure" to be displayed on the diagram C2 in fig. 12. Here, the display control function 143 may also cause the measured blood pressure value "130" to be displayed on the diagram C2.
Similarly, when the extracted 1st event and 2nd event include a measurement of blood pressure on "2012/9/5", the display control function 143 causes "blood pressure" to be displayed on the diagram C3 of fig. 12. In addition, when the extracted 1st event and 2nd event include an MRI examination of the head on "2012/9/5", the display control function 143 displays "MRI examination" in association with the head on the diagram C3 of fig. 12. Here, the display control function 143 may also display an MRI image collected in the MRI examination, or a thumbnail thereof, on the diagram C3. Likewise, when the extracted 1st event and 2nd event include a measurement of blood pressure on "2012/9/10", the display control function 143 may display "blood pressure" on the diagram C4.
Here, the display control function 143 may change the colors of the parts displayed on each diagram according to the date associated with that diagram. For example, the display control function 143 changes the color of each part displayed on the diagram C2 in accordance with its severity as of "2012/9/1", the color of each part displayed on the diagram C3 in accordance with its severity as of "2012/9/5", and the color of each part displayed on the diagram C4 in accordance with its severity as of "2012/9/10".
In the above-described embodiments, it is assumed that the extracted 1st event and 2nd event are mapped onto a diagram and displayed. However, the embodiment is not limited to this. For example, the display control function 143 may create a table in which the extracted 1st event and 2nd event are arranged in time series, and display the table on the display 120. The display control function 143 can also display the extracted 1st event and 2nd event in time series in an arbitrary form. For example, the display control function 143 may display the extracted 1st event and 2nd event on the display 120 as a timeline.
As another example, the display control function 143 first creates a table in which the past events of the patient who visited the hospital are arranged in time series, as shown in fig. 3, and displays the table on the display 120. Here, the display control function 143 displays the extracted 1st event and 2nd event in a form different from that of the other events in the displayed table. For example, the display control function 143 displays the 1st event and the 2nd event among the events in the table in bold characters, or with a character color or background color different from that of the other events.
Similarly, when performing a time-series display other than the table, the display control function 143 can display the 1st event and the 2nd event in a form different from that of the other events. For example, the display control function 143 first causes the past events of the patient who visited the hospital to be displayed on the display 120 as a timeline arranged in time series. Here, the display control function 143 displays the extracted 1st event and 2nd event in a form different from that of the other events in the displayed timeline.
In the above-described embodiments, the case where the 1st event and the 2nd event are extracted upon receiving a selection of a target part or a target disease has been described. However, the embodiment is not limited to this. For example, the medical information processing apparatus 100 may receive a selection of an event and extract a corresponding part or disease.
For example, the reception function 141 first receives a selection of "measurement of blood pressure" as an event. For example, the reception function 141 receives an operation on a blood pressure table displayed on the display 120, and thereby receives the selection of "measurement of blood pressure". Next, the extraction function 142 extracts a part or a disease corresponding to "measurement of blood pressure". For example, the extraction function 142 extracts a circulatory organ such as the heart as the part corresponding to "measurement of blood pressure", or extracts a disease of the circulatory organs as the disease corresponding to "measurement of blood pressure". The display control function 143 causes the extracted part or disease to be displayed on the display 120, for example, on a diagram.
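One simple way to realize this reverse lookup from an event to a part and a disease is master data, as the claims also contemplate. The table entries, key names, and function name below are illustrative assumptions, not the actual master data of the apparatus.

```python
# Hypothetical master data linking an event to its corresponding part/disease.
EVENT_MASTER = {
    "measurement of blood pressure": {"part": "heart",
                                      "disease": "circulatory disease"},
    "headache": {"part": "head", "disease": "head tumor"},
}

def part_and_disease_for(event):
    """Look up the part and disease corresponding to a selected event.
    Returns None when the master data has no entry for the event."""
    return EVENT_MASTER.get(event)

print(part_and_disease_for("measurement of blood pressure"))
```

A learned model that takes an event and outputs a part or disease, as in the claims, could replace the dictionary lookup without changing the caller.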
As another example, the reception function 141 first receives a selection of "headache" as an event. For example, the reception function 141 receives a character input operation from the user, and thereby receives the selection of "headache". Alternatively, the reception function 141 may automatically acquire "headache" as the reason for the hospital visit from a system such as an HIS (Hospital Information System). Next, the extraction function 142 extracts a part or a disease corresponding to "headache". For example, the extraction function 142 extracts the head as the part corresponding to "headache", or extracts a head tumor as the disease corresponding to "headache". The display control function 143 causes the extracted part or disease to be displayed on the display 120, for example, on a diagram.
As another example, the reception function 141 first receives a selection of "abnormality in blood pressure value" as an event. For example, the display control function 143 displays, in time series by a chart or the like, the blood pressure of the patient measured sequentially by a wearable sphygmomanometer. Here, a user such as a doctor or a nurse who refers to the time-series display of the blood pressure inputs "abnormality in blood pressure value" when the blood pressure exceeds a threshold, and the reception function 141 receives the selection of "abnormality in blood pressure value". Alternatively, the reception function 141 may automatically acquire "abnormality in blood pressure value" from a system such as an HIS. Next, the extraction function 142 extracts a part or a disease corresponding to "abnormality in blood pressure value". For example, the extraction function 142 extracts a circulatory organ such as the heart as the part corresponding to "abnormality in blood pressure value", or extracts a disease of the circulatory organs as the disease corresponding to "abnormality in blood pressure value". The display control function 143 causes the extracted part or disease to be displayed on the display 120, for example, on a diagram.
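The threshold check that triggers an "abnormality in blood pressure value" event can be sketched as a simple filter over the wearable readings. The threshold value of 140 and the `(timestamp, systolic)` schema are assumptions for illustration; the actual threshold would be chosen clinically.

```python
def flag_abnormal_bp(readings, systolic_threshold=140):
    """Flag wearable blood-pressure readings whose systolic value exceeds
    the threshold as candidate 'abnormality in blood pressure value' events."""
    return [(ts, bp) for ts, bp in readings if bp > systolic_threshold]

readings = [("2012/9/1 08:00", 130),
            ("2012/9/1 12:00", 152),
            ("2012/9/1 18:00", 138)]
print(flag_abnormal_bp(readings))
```

Each flagged reading could then be passed to the extraction function to look up the corresponding part or disease.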
Here, the extraction function 142 may further extract the past 1 st event corresponding to the extracted part or disease and the past 2 nd event corresponding to the part or disease associated with the extracted part or disease. In this case, the display control function 143 can also cause the 1 st event and the 2 nd event to be displayed mapped on the diagram.
The term "processor" used in the above description refers to, for example, a circuit such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an Application Specific Integrated Circuit (ASIC), or a programmable logic device (e.g., a Simple Programmable Logic Device (SPLD), a Complex Programmable Logic Device (CPLD), or a Field Programmable Gate Array (FPGA)). The processor realizes its functions by reading out and executing a program stored in the memory circuit 130.
In fig. 1, the description has been given assuming that a single memory 130 stores programs corresponding to respective processing functions. However, the embodiment is not limited thereto. For example, a plurality of memories 130 may be distributed, and the processing circuit 140 may read the corresponding program from each memory 130. Instead of storing the program in the memory circuit 130, the program may be directly loaded into the circuit of the processor. In this case, the processor realizes the function by reading out and executing a program loaded in the circuit.
Further, the processing circuit 140 may also realize functions by a processor of an external device connected via a network. For example, the processing circuit 140 reads out and executes a program corresponding to each function from the memory 130, and realizes each function shown in fig. 1 using a server group (cloud) connected to the medical information processing apparatus 100 via a network as a computing resource.
The components of the devices according to the above-described embodiments are functionally conceptual, and need not necessarily be physically configured as shown in the drawings. That is, the specific form of distribution and integration of the devices is not limited to that shown in the drawings, and all or part of them may be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. Further, all or an arbitrary part of the processing functions performed by each device may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as wired-logic hardware.
The medical information processing method described in the above-described embodiment can be realized by executing a program prepared in advance by a computer such as a personal computer or a workstation. The program may be distributed via a network such as the internet. The program may be recorded in a non-transitory computer-readable recording medium such as a hard disk, a Flexible Disk (FD), a CD-ROM, an MO, or a DVD, and may be read from the recording medium by a computer and executed. For example, the program can be incorporated into electronic medical record creation software.
According to at least one of the embodiments described above, grasping of the patient state can be facilitated.
Several embodiments of the present invention have been described, but these embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments may be implemented in other various forms, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are also included in the invention described in the claims and the equivalent scope thereof.

Claims (23)

1. A medical information processing apparatus is characterized in that,
the disclosed device is provided with:
an acceptance unit for accepting selection of a target part or a target disease;
an extraction unit that extracts a past 1 st event corresponding to the selected target part or the target disease and a past 2 nd event corresponding to a part or a disease associated with the target part or the target disease; and
and a display control unit for mapping and displaying the 1 st event and the 2 nd event on a graph.
2. The medical information processing apparatus according to claim 1,
the display control unit displays the 1 st event and the 2 nd event in different areas.
3. The medical information processing apparatus according to claim 1,
the display control unit displays the 1 st event and the 2 nd event in the same area.
4. The medical information processing apparatus according to claim 1,
the display control unit displays the 2 nd event when a request is made from a user after the 1 st event is displayed.
5. The medical information processing apparatus according to claim 1,
the display control unit displays the 1 st event and the 2 nd event when receiving a predetermined operation for displaying a past event of a patient.
6. The medical information processing apparatus according to claim 1,
the display control unit displays the 1 st event and the 2 nd event in different forms.
7. The medical information processing apparatus according to claim 1,
the display control unit displays a plurality of medical images as thumbnails of the 1 st event and the 2 nd event.
8. The medical information processing apparatus according to claim 1,
the display control unit displays the 1 st event and the 2 nd event by mapping the events onto a plurality of graphs.
9. The medical information processing apparatus according to claim 1,
the extraction unit extracts the 1st event and the 2nd event using at least 1 of master data in which a correspondence relationship between parts and events is set and master data in which a correspondence relationship between diseases and events is set.
10. The medical information processing apparatus according to claim 1,
the extraction unit extracts the 1st event and the 2nd event using at least 1 of a learned model that receives an input of a part and outputs a corresponding event and a learned model that receives an input of a disease and outputs a corresponding event.
11. The medical information processing apparatus according to claim 1,
the extraction unit extracts the 1st event and the 2nd event by using a learned model that receives an input of a part or a disease and outputs a corresponding event.
12. The medical information processing apparatus according to claim 1,
the extraction unit specifies a part or a disease associated with the target part or the target disease using at least 1 of the master data in which the correspondence relationship between parts is set, the master data in which the correspondence relationship between diseases is set, and the master data in which the correspondence relationship between parts and diseases is set.
13. The medical information processing apparatus according to claim 1,
the extraction unit identifies a part or a disease associated with the target part or the target disease using at least 1 of a learned model that receives an input of a part and outputs an associated part, a learned model that receives an input of a disease and outputs an associated disease, a learned model that receives an input of a disease and outputs an associated part, and a learned model that receives an input of a part and outputs an associated disease.
14. The medical information processing apparatus according to claim 1,
the extraction unit identifies a part or a disease associated with the target part or the target disease by using a learned model that receives an input of a part or a disease and outputs an associated part or disease.
15. The medical information processing apparatus according to claim 1,
the extraction unit identifies a part or a disease associated with the target part or the target disease using a learned model that receives an input of the part or the disease and outputs an event corresponding to the input part or the disease and an event corresponding to a part or a disease associated with the input part or the disease, and extracts the 1 st event and the 2 nd event.
16. The medical information processing apparatus according to claim 1,
the extraction unit extracts a predetermined number of events having high priority from the past events corresponding to the target part or the target disease and the events corresponding to the part or the disease associated with the target part or the target disease as the 1 st event and the 2 nd event.
17. The medical information processing apparatus according to claim 1,
the extraction unit extracts, in a range that can be displayed in a display region, an event having a high priority, from among a past event corresponding to the target part or the target disease and an event corresponding to a part or a disease associated with the target part or the target disease.
18. The medical information processing apparatus according to claim 1,
the display control unit causes a display unit to display a graph;
the receiving unit receives an operation on the graph displayed on the display unit from a user, and thereby receives the selection of the target part or the target disease.
19. The medical information processing apparatus according to claim 1,
the receiving unit receives a character input from a user, and thereby receives the selection of the target part or the target disease.
20. The medical information processing apparatus according to claim 1,
the receiving unit further receives a date designation;
the display control unit changes the color of the portion displayed on the graph according to the specified date.
21. The medical information processing apparatus according to claim 20,
the display control unit changes the color of the part displayed on the graph based on at least 1 of the severity of each part, the treatment state, and the elapsed time after the treatment at the time of the designated date.
22. A medical information processing method is characterized in that,
the method comprises the following steps:
receiving a selection of a target part or a target disease;
extracting a past first event corresponding to the selected target part or target disease and a past second event corresponding to a part or a disease associated with the target part or the target disease;
mapping the first event and the second event on a diagram and displaying them.
23. A program, characterized in that,
causing a computer to execute the following processes:
receiving a selection of a target part or a target disease;
extracting a past first event corresponding to the selected target part or target disease and a past second event corresponding to a part or a disease associated with the target part or the target disease;
mapping the first event and the second event on a diagram and displaying them.
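The extraction step in the method claim above (select a target, gather its own past events and those of associated parts, keep a predetermined number of high-priority ones) can be illustrated with a minimal sketch. This is not the patented implementation: the patent specifies no code, and every name here (`Event`, `ASSOCIATED`, `extract_events`) and all sample data are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Event:
    part: str          # body part or disease the event relates to
    description: str
    occurred: date
    priority: int      # higher value = higher display priority

# Hypothetical association table: parts/diseases related to a target.
ASSOCIATED = {
    "liver": ["gallbladder", "pancreas"],
    "heart": ["lung"],
}

def extract_events(history, target, limit=5):
    """Return (first_events, second_events) for a selected target.

    first_events:  past events of the target part/disease itself.
    second_events: past events of associated parts/diseases.
    Both lists are truncated to the `limit` highest-priority events,
    mirroring the claimed "predetermined number" of events.
    """
    first = [e for e in history if e.part == target]
    related = set(ASSOCIATED.get(target, []))
    second = [e for e in history if e.part in related]
    key = lambda e: (-e.priority, e.occurred)
    return sorted(first, key=key)[:limit], sorted(second, key=key)[:limit]

# Hypothetical patient history.
history = [
    Event("liver", "resection", date(2018, 6, 1), priority=3),
    Event("gallbladder", "stone removal", date(2017, 2, 10), priority=2),
    Event("heart", "stent placement", date(2019, 1, 5), priority=5),
]

first, second = extract_events(history, "liver")
# `first` holds the liver event; `second` holds the gallbladder event,
# which a display controller could then map onto a body diagram.
```

The final display step (mapping both event lists onto a diagram) is deliberately omitted, since the claims leave the rendering entirely open.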
CN202010330651.2A 2019-04-24 2020-04-24 Medical information processing device, medical information processing method, and program Active CN111863179B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-082625 2019-04-24
JP2019082625A JP7313888B2 (en) 2019-04-24 2019-04-24 MEDICAL INFORMATION PROCESSING APPARATUS, MEDICAL INFORMATION PROGRAM AND METHOD

Publications (2)

Publication Number Publication Date
CN111863179A true CN111863179A (en) 2020-10-30
CN111863179B CN111863179B (en) 2024-01-16

Family

ID=72985780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010330651.2A Active CN111863179B (en) 2019-04-24 2020-04-24 Medical information processing device, medical information processing method, and program

Country Status (2)

Country Link
JP (1) JP7313888B2 (en)
CN (1) CN111863179B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4283629A1 (en) 2021-01-20 2023-11-29 FUJIFILM Corporation Document preparation assistance device, document preparation assistance method, and program

Citations (8)

Publication number Priority date Publication date Assignee Title
JP2004258978A (en) * 2003-02-26 2004-09-16 Toshiba Corp Virtual patient system, information providing system, and medical information providing method
WO2009034868A1 (en) * 2007-09-11 2009-03-19 Konica Minolta Medical & Graphic, Inc. Medical image system, medical image management device, data processing method, and program
WO2014007302A1 (en) * 2012-07-06 2014-01-09 Yuyama Mfg. Co., Ltd. Compounding information administration system, compounding information administration method, compounding information administration program, and recording medium
US20160253460A1 (en) * 2013-11-14 2016-09-01 Fujifilm Corporation Diagnostic information display control device, method, and program
CN107847144A (en) * 2015-09-24 2018-03-27 株式会社日立制作所 Diagnosis aid system and diagnostic assistance method for information display
CN108231150A (en) * 2016-12-09 2018-06-29 Qingdao Luqi Information Technology Co., Ltd. An inspection information system
CN109427419A (en) * 2017-08-24 2019-03-05 佳能医疗系统株式会社 Medical information processing system
CN109637622A (en) * 2013-07-16 2019-04-16 精工爱普生株式会社 Information processing unit, information processing method and information processing system

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP2001290883A (en) * 2000-04-05 2001-10-19 Sanyo Electric Co Ltd Medical examination assisting device
JP4959996B2 (en) * 2006-03-23 2012-06-27 株式会社東芝 Interpretation report display device
US10032236B2 (en) * 2007-04-26 2018-07-24 General Electric Company Electronic health record timeline and the human figure
WO2012093163A2 (en) * 2011-01-07 2012-07-12 Novartis Ag Display of clinical research data using an avatar
JP5701685B2 (en) * 2011-05-26 2015-04-15 富士フイルム株式会社 MEDICAL INFORMATION DISPLAY DEVICE, ITS OPERATION METHOD, AND MEDICAL INFORMATION DISPLAY PROGRAM
US20140038152A1 (en) * 2012-07-31 2014-02-06 Sandro Micieli Medical visualization method and system
JP6301116B2 (en) * 2013-11-22 2018-03-28 キヤノンメディカルシステムズ株式会社 Medical image display device
JP6356466B2 (en) * 2014-04-09 2018-07-11 日本メディカルソリューションズ株式会社 Medical record creation support system, server device, medical institution terminal, medical record creation support method, medical institution device, and medical record creation support program
JP6751682B2 (en) * 2017-03-09 2020-09-09 富士フイルム株式会社 Medical imaging controls, methods and programs


Also Published As

Publication number Publication date
JP2020181288A (en) 2020-11-05
JP7313888B2 (en) 2023-07-25
CN111863179B (en) 2024-01-16

Similar Documents

Publication Publication Date Title
US9122773B2 (en) Medical information display apparatus and operation method and program
JP6671322B2 (en) Medical information providing device, method of operating medical information providing device, and medical information providing program
US8934687B2 (en) Image processing device, method and program including processing of tomographic images
JP6334874B2 (en) Medical image display device and display control method in medical image display device
JP2007252609A (en) Diagnostic reading report display device
EP3276571A1 (en) Medical diagnosis assistance device, method for operating medical diagnosis assistance device, and medical diagnosis assistance system
JP6853144B2 (en) Medical information processing system
US20160085918A1 (en) Medical assistance device, operation method and operation program for medical assistance device, and medical assistance system
JP2017027266A (en) Information analysis support device, operation method of the same, operation program of the same, and information analysis support system
US20080229281A1 (en) Method for data exchange between medical apparatuses
CN111863179B (en) Medical information processing device, medical information processing method, and program
US20200342964A1 (en) Medical information processing apparatus, ordering system and method
US20210005310A1 (en) Order creation support apparatus and order creation support method
JP2010264234A (en) System and method for promoting utilization of medical information
EP3716277A1 (en) Medical care assistance device, and operation method and operation program therefor
WO2022024479A1 (en) Diagnosis assisting device, operating method and operating program therefor, and diagnosis assisting system
JP7462424B2 (en) Medical information processing device, learning data generation program, and learning data generation method
JP5958955B2 (en) Radiation information management system, radiation information management method, and radiation information management program
US20220277844A1 (en) Order management method and program, order management system, and database for medical practices
JP7313838B2 (en) Medical information processing system and medical information processing method
JP7209738B2 (en) Image analysis device, analysis function determination method, and analysis function determination program
JP7383440B2 (en) Diagnostic support system, diagnostic support device and program
US20220375555A1 (en) Medical information processing apparatus and medical information processing system
JP7171271B2 (en) Medical information display device
JP2023113416A (en) Medical information display device, medical information display method, and medical information display program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant