WO2021199952A1 - Information processing method, program, and information processing system - Google Patents

Information processing method, program, and information processing system

Info

Publication number
WO2021199952A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
information processing
time
patient
action
Prior art date
Application number
PCT/JP2021/009236
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
雄紀 坂口
悠介 関
陽 井口
Original Assignee
テルモ株式会社
Priority date
Filing date
Publication date
Application filed by テルモ株式会社
Priority to JP2022511724A (JP7699105B2)
Publication of WO2021199952A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records

Definitions

  • the present invention relates to an information processing method, a program and an information processing system.
  • Catheter treatment is carried out with the cooperation of staff in various occupations, such as doctors, nurses, clinical laboratory technicians, clinical engineers, and radiological technologists. By sharing information on the treatment progress and future prospects in real time, each staff member can fulfill their responsibilities efficiently, and the treatment can proceed smoothly.
  • However, Patent Document 1 cannot support information sharing among the staff members participating in a treatment.
  • The information processing method acquires patient information about a patient undergoing catheter treatment, sequentially acquires actions related to the catheter treatment of the patient and the times at which the actions are performed, and causes a computer to execute a process of displaying, on a first display device, the acquired patient information and time-series information in which the acquired actions are arranged in chronological order.
  • FIG. 1 is an explanatory diagram illustrating the configuration of the information processing system 10.
  • the information processing system 10 of the present embodiment is used in the catheter room and the operation room adjacent to the catheter room. Doors and windows (not shown) are provided between the catheter room and the operation room.
  • FIG. 1 schematically shows the information processing system 10. The arrangement and number of each component is not limited to FIG.
  • the catheter room is an example of a place where the information processing system 10 is used.
  • the information processing system 10 may be used in an operating room or the like.
  • The information processing system 10 includes an information processing device 20, a terminal device 30, a display device 12, a catheter control device 13, a microphone 17, a camera 11, a bedside monitor 16, a barcode reader 15, a diagnostic imaging device 14, and an electronic medical record system 19.
  • Each device or the like is connected by wire or wirelessly via a network such as HIS (Hospital Information System). Some devices may be directly connected to each other without going through a network.
  • the information processing system 10 of the present embodiment is a system that supports catheter treatment performed in a catheter room. Patients undergoing catheterization lie on the examination table 18.
  • The terminal device 30 is a device mainly used by staff in charge of non-sterile (unclean) areas or by visitors in the catheter room or the like.
  • the terminal device 30 executes the program of the present embodiment in cooperation with the information processing device 20. Details of the information processing device 20 and the terminal device 30 will be described later.
  • the terminal device 30 is an example of the first display device.
  • the display device 12 is, for example, a large-sized display device such as a large-sized liquid crystal display device or an organic EL (Electro Luminescence) display device.
  • A plurality of display devices 12 are installed in the catheter room.
  • For example, three display devices 12 are suspended from the ceiling of the catheter room, arranged in a horizontal row, and adjusted so as to be easily seen by the doctor in charge of the catheter treatment. Display devices 12 are also arranged on the wall surfaces of the catheter room, the operation room adjacent to the catheter room, and the like.
  • the display device 12 is an example of the second display device.
  • On one of the plurality of display devices 12, arranged so as to be easily seen by the doctor in charge of the catheter treatment, the latest image taken by the X-ray angiography apparatus is always displayed.
  • the information displayed on the other display devices 12 will be described later.
  • the MDU 131 and the diagnostic imaging catheter 132 are connected to the catheter control device 13.
  • the diagnostic imaging catheter 132 is used by inserting it into a luminal organ.
  • the luminal organ into which the diagnostic imaging catheter 132 is inserted is, for example, a blood vessel, pancreatic duct, bile duct or bronchus.
  • the diagnostic imaging catheter 132 may be inserted into the heart such as in the atrium or the ventricle via a blood vessel.
  • the diagnostic imaging catheter 132 is, for example, an IVUS (Intravascular Ultrasound) catheter, an OCT (Optical Coherence Tomography) catheter, an OFDI (Optical Frequency Domain Imaging) catheter, or the like.
  • the diagnostic imaging catheter 132 may be equipped with a combination of two or three of the IVUS, OCT, and OFDI sensors.
  • The diagnostic imaging catheter 132 can generate a tomographic image containing, in addition to the walls of luminal organs such as blood vessel walls, reflectors existing inside the luminal organs, such as red blood cells, and organs existing outside the luminal organs, such as the epicardium and the heart.
  • the diagnostic imaging catheter 132 may be a vascular endoscope used when optically observing the inner wall of a luminal organ.
  • the microphone 17 accepts voice input.
  • The bedside monitor 16 is a device that measures and displays vital signs of a patient undergoing catheter treatment, such as the electrocardiogram, blood pressure, heart rate, and percutaneous arterial oxygen saturation (SpO2). Electrodes for ECG measurement and the like are connected to the bedside monitor 16.
  • The camera 11 captures the situation inside the catheter room.
  • the captured image is displayed on the display device 12 in the operation room, for example.
  • the captured video may be recorded in the log DB65 (see FIG. 2) described later.
  • the camera 11 may be used for accepting gesture input.
  • The barcode reader 15 is used to read barcodes provided on the packaging materials of equipment used for catheter treatment, such as the diagnostic imaging catheter 132, guiding catheters, guide wires, balloon catheters, and stents. Information such as the type and serial number of the equipment is recorded in the barcode. A URL (Uniform Resource Locator) from which information such as the detailed specifications of the equipment can be accessed may also be recorded in the barcode.
  • the diagnostic imaging apparatus 14 is, for example, an X-ray angiography apparatus, an X-ray CT (Computed Tomography) apparatus, an MRI (Magnetic Resonance Imaging) apparatus, a PET (Positron Emission Tomography) apparatus, or an ultrasonic diagnostic apparatus.
  • the image taken by the catheter control device 13 and the image taken by the diagnostic imaging device 14 may be comprehensively described as a medical image.
  • FIG. 2 is an explanatory diagram illustrating the configuration of the information processing device 20.
  • the information processing device 20 includes a control unit 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, and a bus.
  • the control unit 21 is an arithmetic control device that executes the program of the present embodiment.
  • One or more CPUs (Central Processing Units), GPUs (Graphics Processing Units), TPUs (Tensor Processing Units), multi-core CPUs, and the like are used for the control unit 21.
  • the control unit 21 is connected to each hardware unit constituting the information processing device 20 via a bus.
  • the main storage device 22 is a storage device such as SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), and flash memory.
  • The main storage device 22 temporarily stores information necessary during the processing performed by the control unit 21 and the program being executed by the control unit 21.
  • The auxiliary storage device 23 is a storage device such as an SRAM, a flash memory, a hard disk, or a magnetic tape.
  • the auxiliary storage device 23 stores a program to be executed by the control unit 21, a log DB (Database) 65, a standby device DB 66, a drug cumulative amount DB 67, and various data necessary for executing the program.
  • the log DB 65, the standby device DB 66, and the drug cumulative amount DB 67 may be stored in an external large-capacity storage device or the like connected to the information processing device 20.
  • the communication unit 24 is an interface for communicating between the information processing device 20 and the network.
  • the information processing device 20 is a general-purpose personal computer, a tablet, a large computer, or a virtual machine that operates on the large computer.
  • the information processing device 20 may be composed of a plurality of personal computers that perform distributed processing, or hardware such as a large computer.
  • the information processing device 20 may be configured by a cloud computing system or a quantum computer.
  • FIG. 3 is an explanatory diagram illustrating the configuration of the terminal device 30.
  • the terminal device 30 includes a control unit 31, a main storage device 32, an auxiliary storage device 33, a communication unit 34, a touch panel 35, and a bus.
  • the touch panel 35 includes a display unit 351 such as a liquid crystal panel or an organic EL panel, and an input unit 352 laminated on the surface of the display unit 351.
  • the control unit 31 of the terminal device 30 is an arithmetic control device that executes the program of the present embodiment in cooperation with the control unit 21 of the information processing device 20.
  • One or more CPUs, GPUs, multi-core CPUs, and the like are used for the control unit 31.
  • the control unit 31 is connected to each hardware unit constituting the terminal device 30 via a bus.
  • The main storage device 32 is a storage device such as an SRAM, a DRAM, or a flash memory.
  • the main storage device 32 temporarily stores information necessary during the processing performed by the control unit 31 and the program being executed by the control unit 31.
  • The terminal device 30 is a general-purpose information device such as a tablet or a smartphone.
  • the terminal device 30 may be an information device dedicated to the information processing system 10 of the present embodiment.
  • the terminal device 30 may be individually carried by a staff member such as a nurse, or may be installed on a wall or the like.
  • FIG. 4 is an explanatory diagram illustrating the record layout of the log DB65.
  • the log DB 65 is a DB that records a catheter treatment log in which the patient ID, the date on which the catheter treatment was performed, and the progress of the catheter treatment are associated with each other.
  • the log DB65 has a patient ID field, a date field and a progress field.
  • the progress field has a time field, an action field, a file field, a patient status field, a vital field, and a strategy field.
  • the patient ID is recorded in the patient ID field.
  • the date field records the date of catheterization.
  • the time is recorded in the time field.
  • the action field records the actions taken during catheterization.
  • In the file field, a file such as an image acquired from the catheter control device 13 or the diagnostic imaging device 14, or a voice memo acquired from the microphone 17, is recorded. Links to these files may be recorded in the file field, with the files themselves recorded in a file server or the like (not shown). A moving image acquired from the catheter control device 13 or the diagnostic imaging device 14 may also be recorded in the file field.
  • In the patient status field, the patient status is recorded with a code such as "A" or "B". Details of the patient's condition will be described later.
  • In the vital field, the vitals at the time the doctor confirms them are recorded.
  • various vital log data are recorded by the bedside monitor 16.
  • For example, the action of administering "heparin XXX units" to the patient is recorded.
  • The patient status and vitals at the time of administration are recorded, but no file or strategy is recorded.
  • In another record, only the patient's condition is recorded and no action is recorded. Details of the data recording method in the log DB 65 will be described later.
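  • As a rough illustration of the record layout described above, the catheter treatment log could be modeled as follows. This is a minimal sketch in Python; the field names (patient_id, action, vitals, and so on) are illustrative and are not taken from the specification.

```python
from dataclasses import dataclass, field
from datetime import date, time
from typing import Optional

@dataclass
class ProgressEntry:
    # One row of the progress field: an action and its context at a given time.
    time: time
    action: Optional[str] = None                 # e.g. "heparin XXX units administered"
    files: list = field(default_factory=list)    # links to images or voice memos
    patient_status: Optional[str] = None         # code such as "A" or "B"
    vitals: Optional[dict] = None                # values at the moment the doctor confirmed them
    strategy: Optional[str] = None               # link to a strategy file

@dataclass
class CatheterTreatmentLog:
    # One catheter treatment log: patient ID, date, and the time-series progress.
    patient_id: str
    treatment_date: date
    progress: list = field(default_factory=list) # list of ProgressEntry, in time order
```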
  • FIG. 5 is an explanatory diagram illustrating the record layout of the standby device DB66.
  • the standby device DB 66 is a DB that records the equipment used in the catheter room and the number of times of use in association with each other.
  • the standby device DB 66 has an equipment information field and a usage count field.
  • the equipment information field has a product type field and a type field.
  • The product type of each piece of equipment is recorded in the product type field.
  • The type of equipment is recorded in the type field. For example, "GW" indicates a guide wire, "BDC" indicates a balloon dilatation catheter, and "IVUS" indicates an IVUS catheter.
  • The type of equipment may include information on specifications such as the outer diameter and length of the equipment. For example, "***** 3.0 mm x 10 mm" recorded in the "BDC" record indicates that the balloon has a diameter of 3.0 mm and a length of 10 mm when expanded. In the usage count field, the number of times the device has been inserted into the body during the current procedure is recorded.
  • the outline of the data recording method in the equipment information field will be explained.
  • Equipment used for catheter treatment is brought into the catheter room.
  • the staff operates the barcode reader 15 to read the barcode attached to the packing material of each equipment.
  • information such as the type of equipment is recorded on the barcode.
  • the bar code reader 15 transmits the read information to the information processing device 20 via the network.
  • the control unit 21 creates a new record in the standby device DB 66.
  • the control unit 21 determines the type of equipment based on the product type recorded in the barcode read by the barcode reader 15.
  • the control unit 21 may transmit the product type to the equipment manufacturer's WEB site or the like to receive the equipment type.
  • the control unit 21 records the type of equipment in the product type field and the type of equipment in the type field.
  • the control unit 21 sets the usage count field to the initial value "0".
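  • The barcode-driven registration into the standby device DB 66 might look like the following sketch. The dictionary keys, the lookup_equipment_type() helper, and the mapping it returns are assumptions for illustration; the specification only states that the type is determined from the product type, possibly by querying the manufacturer's web site.

```python
def register_equipment(standby_devices: list, barcode_info: dict) -> None:
    """Create a new standby-device record with the usage count initialized to 0."""
    product_type = barcode_info["product_type"]        # read from the barcode
    standby_devices.append({
        "product_type": product_type,
        "type": lookup_equipment_type(product_type),   # hypothetical helper, see below
        "usage_count": 0,
    })

def lookup_equipment_type(product_type: str) -> str:
    # In practice the type could be obtained by sending the product type to the
    # equipment manufacturer's web site; this stub only illustrates the mapping.
    return "GW" if "guide wire" in product_type.lower() else "BDC"
```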
  • FIG. 6 is an explanatory diagram for explaining the record layout of the drug cumulative amount DB67.
  • the drug cumulative amount DB 67 is a DB that records the name of the drug and the cumulative dose in association with each other.
  • the drug cumulative amount DB67 has a drug field and a dose field. The name of the drug is recorded in the drug field. Cumulative doses of the drug are recorded in the dose field. The initial value of the dose field is "0".
  • FIG. 7 and 8 are examples of screens displayed by the terminal device 30.
  • FIG. 7 shows an example of a screen displayed on the touch panel 35 of the terminal device 30 used by the staff in charge of supporting catheter treatment.
  • the screen shown in FIG. 7 includes a patient information field 51, a time series information field 52, a device button 53, a medication button 54, a vital button 55, and a patient status input field 56.
  • the date and current time are displayed in the upper right corner of the screen.
  • Table 1 is an example of patient information displayed on the touch panel 35 by the control unit 31 of the terminal device 30 when the user selects the patient information field 51.
  • the control unit 31 may display the data recorded in the electronic medical record system 19 such as an X-ray CT image taken prior to the catheter treatment on the touch panel 35.
  • the items of detailed information may be set differently depending on the type of job of the user who uses each terminal device 30.
  • In the time-series information column 52, various actions related to the catheter treatment and the times at which the actions occurred are displayed in chronological order.
  • Each control unit 31 displays the time series information field 52 in a predetermined format based on the log DB 65.
  • Each control unit 31 sequentially accesses the log DB 65 to update the display of the time series information column 52.
  • the latest status is displayed in the time series information column 52 of any terminal device 30.
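  • Keeping every terminal's time-series information column 52 up to date amounts to re-reading the log and re-rendering it whenever new entries appear. A minimal polling sketch, assuming hypothetical fetch_log(), render(), and stop() callbacks and a one-second refresh interval (none of which are specified in the text):

```python
import time

def refresh_time_series_column(fetch_log, render, stop, interval_sec: float = 1.0):
    """Poll the catheter treatment log and re-render the time-series column
    whenever the number of entries changes."""
    last_count = 0
    while not stop():
        entries = fetch_log()                                  # rows from the log DB 65
        if len(entries) != last_count:
            render(sorted(entries, key=lambda e: e["time"]))   # chronological order
            last_count = len(entries)
        time.sleep(interval_sec)
```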
  • heparin was administered in XXX units.
  • The vital confirmation mark 524 indicates that the doctor confirmed the patient's vitals at the time of heparin administration. The patient status mark 523 showing a smile indicates that, at the time of heparin administration, the patient does not feel pain or the like and is in good condition.
  • When CAG (coronary angiography) is performed, a thumbnail image 522, which is a reduced version of the image taken by coronary angiography, is displayed in the time series information column 52.
  • the thumbnail image is an example of a reduced image.
  • a tomographic image was taken with an IVUS catheter.
  • a thumbnail image 522 which is a reduced image of the captured tomographic image, is displayed in the time-series information column 52.
  • an input field 526 for accepting input of the next item is displayed. The operation method of the information processing system 10 using the screen shown in FIG. 7 will be described later.
  • FIG. 8 is an example of a screen displayed by the terminal device 30.
  • FIG. 8 shows an example of a screen displayed on the terminal device 30 used by the staff in the operation room.
  • the control unit 21 may display the screen shown in FIG. 8 on a part of the display devices 12.
  • FIG. 9 is a modified example of the time series information column 52.
  • the time is displayed at the left end of the time series information column 52.
  • the action at each time is displayed separately in a column of large items such as “drug administration” and a column of small items such as “heparin XXX unit”.
  • When a balloon or a stent is expanded, the balloon expansion pressure or the stent expansion pressure and the expansion time are displayed.
  • balloon dilation was performed to dilate the stenosis using a balloon dilatation catheter.
  • Thumbnail images 522 of medical images taken by coronary angiography during or before and after the balloon dilation indicate where the balloon dilation was performed.
  • the control unit 21 of the information processing device 20 or the control unit 31 of the terminal device 30 may receive an instruction from the user to select the diagnostic imaging device 14 for displaying the original image.
  • the control unit 31 acquires the original image from the log DB 65 or the electronic medical record system 19 and displays it on the touch panel 35. For example, a visitor such as a student can check the original image without affecting the work of the staff who is actually performing catheter treatment.
  • FIG. 10 to 20 are examples of screens displayed by the terminal device 30.
  • FIG. 10 shows an example of a screen displayed by the control unit 31 when the selection of the device button 53 is accepted. Next to the device button 53, the equipment list column 539 is displayed.
  • the device icon 531 corresponding to each equipment recorded in the standby device DB66 is displayed. It is desirable that the device icon 531 has a design that allows the type of equipment and main specifications to be easily identified.
  • Next to the device icon 531 is a usage count column 532.
  • the usage count recorded in the usage count field of the standby device DB 66 is displayed.
  • From the equipment list column 539, the user can confirm the equipment that is ready for use and the number of times each piece of equipment has been used.
  • the staff drags and drops the device icon 531 corresponding to the selected equipment to the time series information field 52.
  • the staff may perform a double tap operation or the like instead of the drag and drop operation.
  • the drag-and-drop operation and the double-tap operation are examples of a selection operation in which the user selects equipment from the equipment list column 539.
  • the user may input additional information regarding the use of the equipment by voice input or the like.
  • the control unit 21 of the information processing device 20 records the input additional information in the log DB 65 together with the information about the selected equipment.
  • the control unit 31 of the terminal device 30 transmits information regarding the device icon 531 that has been dragged and dropped to the information processing device 20.
  • the control unit 21 of the information processing device 20 creates a new record in the log DB 65, and records the equipment used and the time when the use is started.
  • the control unit 21 extracts the record corresponding to the equipment that has been dragged and dropped from the standby device DB 66, and adds 1 to the usage count field.
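  • In other words, one drag-and-drop of a device icon produces two updates: a new log entry with the start time of use, and an incremented usage count for the selected equipment. A minimal sketch, with in-memory structures standing in for the log DB 65 and the standby device DB 66 and illustrative key names:

```python
from datetime import datetime

def record_equipment_use(log: list, standby_devices: dict, equipment_name: str) -> None:
    """Append a log entry for the start of equipment use and add 1 to its usage count."""
    log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "action": f"started using {equipment_name}",
    })
    standby_devices[equipment_name]["usage_count"] += 1

# usage sketch
log, devices = [], {"BDC 3.0 mm x 10 mm": {"usage_count": 0}}
record_equipment_use(log, devices, "BDC 3.0 mm x 10 mm")
```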
  • FIG. 11 shows an example of a screen displayed by the control unit 31 of the terminal device 30 when the selection of the medication button 54 is accepted.
  • a drug list column 549 is displayed below the dosing button 54.
  • a drug button 541 corresponding to the combination of the drug used in the catheter examination and the dose is displayed.
  • the staff taps the corresponding drug button 541.
  • information on the type or setting of the pump may be registered in the log DB 65.
  • The control unit 31 of the terminal device 30 transmits information regarding the tapped drug button 541 to the information processing device 20.
  • the control unit 21 extracts the record corresponding to the tapped drug from the drug cumulative amount DB 67 and adds the dose to the dose field.
  • the control unit 21 creates a new record in the log DB 65 and records the administered drug, the dose, and the cumulative dose.
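  • The medication flow is analogous: the dose is added to the cumulative amount for the drug, and a log entry records the drug, the dose, and the new cumulative dose. A minimal sketch under the same in-memory assumptions as above:

```python
from datetime import datetime

def record_medication(log: list, cumulative: dict, drug: str, dose: float) -> float:
    """Update the cumulative dose for the drug and append a log entry; returns the new total."""
    cumulative[drug] = cumulative.get(drug, 0.0) + dose
    log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "action": f"administered {drug}: {dose} (cumulative {cumulative[drug]})",
    })
    return cumulative[drug]
```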
  • FIG. 12 shows an example of a screen displayed by the control unit 31 of the terminal device 30 when the selection of the vital button 55 is accepted.
  • the vital graph column 551 is displayed below the vital button 55.
  • In the vital graph column 551, for example, a graph similar to the screen of the bedside monitor 16 is displayed.
  • Vital graph column 551 is an example of patient vital information.
  • the control unit 21 receives the tap operation of the vital button 55 via the control unit 31 and records the vitals in the vital field of the log DB 65.
  • the vital confirmation mark 524 in the time series information column 52 indicates that vitals are recorded in the vital field.
  • the staff in the operation room can grasp that the doctor has confirmed the vitals by the vital confirmation mark 524, and can pay attention to the patient's vital condition without waiting for the direct instruction from the doctor.
  • the control unit 31 of the terminal device 30 may display the vitals recorded in the vital field of the log DB65 in the time series information column 52 instead of the vital confirmation mark 524.
  • the control unit 21 may display a mark that can identify whether the vital recorded in the vital field is within the normal range or an abnormal value.
  • A point of interest handwritten on the vital information displayed on the display device 12 or the terminal device 30 in the operation room may be displayed in the same manner in the vital graph column 551 of the terminal device 30 in the catheter room. This makes it possible to call attention from the operation room by pointing out changes in vital signs that the surgeon has not noticed. Input from the operation room may be performed via a touch panel, or via a mouse or a keyboard.
  • FIG. 13 shows a screen of a modified example. Even when the vital button 55 is not selected, the vital value column 552 is displayed on the screen. In the vital value column 552, the vital values acquired by the bedside monitor 16 are always displayed.
  • the control unit 21 of the information processing device 20 receives the tap operation of the vital button 55 via the control unit 31 of the terminal device 30 and records the vitals in the vital field of the log DB 65.
  • A nurse or the like judges the patient's condition by asking the patient about the presence or absence of pain.
  • the nurse may observe the patient and determine the patient's condition.
  • The nurse taps an appropriate patient status icon among the plurality of patient status icons displayed in the patient status input field 56.
  • The control unit 21 of the information processing device 20 receives a tap operation of the patient status icon via the control unit 31 of the terminal device 30, and records a code corresponding to the selected patient status icon in the patient status field of the log DB 65.
  • the patient status mark 523 in the time series information column 52 indicates the patient status.
  • FIG. 14 shows a screen when a doctor or staff inputs a voice memo.
  • the microphone icon indicates that the voice memo was input at 10:34.
  • the control unit 21 creates a new record in the log DB 65 and records the voice memo data in the file field.
  • the strategy mark 525 will be explained.
  • When a pullback operation is performed using IVUS, the catheter control device 13 generates a longitudinal tomographic image and transverse tomographic images.
  • the catheter control device 13 may have a function of detecting a lesion portion based on a tomographic image and a function of displaying a basis for the detected lesion.
  • the doctor formulates a treatment strategy based on the information such as the tomographic image generated by the catheter control device 13 and the examinations performed in the past.
  • the physician may develop a treatment strategy when performing coronary angiography.
  • a strategy file summarizing the planned treatment strategy and data such as referenced images is recorded in the strategy field of the log DB65.
  • the strategy mark 525 in the time series information column 52 means that the strategy file is recorded in the log DB 65.
  • the strategy file may be recorded in the electronic medical record system 19.
  • the strategy file is displayed on the touch panel 35 or the display device 12.
  • the user can confirm the judgment of the doctor and the rationale thereof, and prepare for the next work.
  • FIG. 15 shows an example of a strategy file when a doctor devises a treatment strategy for performing “stent placement”.
  • The catheter control device 13 detects a "dissection" based on the tomographic image, and outputs the basis for detecting the "dissection" using Grad-CAM (Gradient-weighted Class Activation Mapping).
  • the doctor decides a treatment strategy to perform "stent placement" based on the image or the like generated by the catheter control device 13, and records it in the strategy file.
  • the treatment strategy and the images referenced by the doctor are summarized in the strategy file.
  • FIG. 16 shows an example of a strategy file when the doctor determines the dimensions of the stent to be placed. Based on the doctor's instructions, the staff will measure the length of the stenosis and the diameter of the blood vessel from the tomographic image. The doctor selects the stent to use based on the measurement results and records it in the strategy file.
  • FIG. 17 shows an example of a screen used when the doctor in charge of catheter treatment records supplementary information in the log DB 65 after the catheter treatment is completed.
  • the doctor displays the screen shown in FIG. 17 on the screen of the terminal device 30 or the personal computer.
  • the patient information column 51 and the time series information column 52 are displayed.
  • FIG. 17 shows an example in which the patient status mark 523 is not displayed in the time series information column 52.
  • the patient status mark 523 may be displayed in the time series information column 52.
  • the name of the user logged on to the information processing system 10 is displayed in the user name field 529 at the bottom of the screen.
  • a correction button 571, an approval button 572, and an output button 573 are displayed at the top of the screen.
  • the doctor records supplementary information that could not be recorded during catheter treatment in the additional comment section 528.
  • the doctor can enter any comment in the additional comment field 528 by keyboard operation or voice input.
  • the additional comment field 528 may accept the selection of a typical comment sentence by, for example, a pull-down menu or the like.
  • the doctor selects the correction button 571.
  • the comment input in the additional comment field 528 is reflected in the log DB 65.
  • When the doctor selects the output button 573, the screen shown in FIG. 17 is printed.
  • When a predetermined approver logs on and selects the approval button 572, the data recorded in the log DB 65 is recorded in the electronic medical record system 19.
  • FIG. 18 shows an example of a screen used by a nurse to record supplementary information in the log DB 65 after the catheter treatment is completed. Instead of the approval button 572 in FIG. 17, a confirmation button 576 is displayed. The patient status mark 523 is displayed in the time series information column 52.
  • the nurse can enter any comment in the additional comment field 528 by keyboard operation or voice input. After completing the input, the nurse selects the correction button 571.
  • the comment input in the additional comment field 528 is tentatively recorded in the auxiliary storage device 23.
  • When the doctor in charge of the catheter treatment selects the confirmation button 576, the comment input in the additional comment field 528 is reflected in the log DB 65.
  • Clinical laboratory technicians, clinical engineers, and other staff also enter additional comments based on their respective duties.
  • the records by the staff of each occupation can be aggregated in the log DB65. Since it is not necessary for the staff of each occupation to input the same information more than once, it is possible to provide the information processing system 10 that can reduce the load on the staff.
  • FIGS. 19 and 20 show an example of a screen for performing workflow analysis related to catheter treatment based on the catheter treatment log.
  • the screen shown in FIG. 19 is displayed on the terminal device 30 or a personal computer connected to the HIS.
  • the analysis result column 578 and the comment input button 575 are displayed on the screens shown in FIGS. 19 and 20.
  • the analysis result column 578 of FIG. 19 includes a time series diagram 58 and a legend column 589.
  • the time series diagram 58 shows the passage of time from left to right, and is a diagram showing the work performed at each time by hatching. For convenience of illustration, some hatching types are duplicated. The steps indicated by each hatch are displayed in characters in the legend column 589.
  • The time of X-ray irradiation is displayed on the lower side of the time series diagram 58 based on the radiation irradiation log recorded in the diagnostic imaging apparatus 14. Since the method of creating the time series diagram 58 based on the log data is known, the description thereof is omitted.
  • the user can confirm which work took how long.
  • the user can input a comment by selecting the comment input button 575.
  • the entered comment may be displayed at the bottom of the screen, for example.
  • the analysis result column 578 of FIG. 20 includes a first time series diagram 581 and a second time series diagram 582.
  • The first time series diagram 581 shows the time series of the actual case.
  • The second time series diagram 582 shows a time series of standard procedure times for similar treatment procedures. Standard procedure times are created statistically based on, for example, past cases.
  • the screen shown in FIG. 20 allows the user to compare the standard procedure time with the time required for the actual procedure.
  • the user can enter a comment that considers, for example, a task that took longer than the standard procedure time.
  • the workflow analysis result may be displayed in any format such as a table format or a graph format.
  • FIG. 21 to 26 are flowcharts for explaining the flow of program processing.
  • FIG. 21 shows a program for recording equipment in the standby device DB 66.
  • the equipment used for catheter treatment is brought into the catheter room.
  • the staff operates the barcode reader 15 to read the barcode attached to the packing material of each equipment.
  • the bar code reader 15 transmits the read bar code information to the information processing device 20. After that, the program shown in FIG. 21 is executed.
  • the control unit 21 of the information processing device 20 receives the barcode information transmitted from the barcode reader 15 (step S501).
  • the control unit 21 acquires equipment information regarding the type and type of equipment based on the barcode information (step S502).
  • Equipment information is included in, for example, barcode information.
  • the equipment information may be acquired by accessing the URL shown in the barcode information.
  • the control unit 21 creates a new record in the standby device DB 66.
  • the control unit 21 records the product type and type acquired in step S502 in the product type field and type field.
  • the control unit 21 records an initial value of 0 in the usage count field.
  • the carried-in equipment is recorded in the standby device DB 66 (step S503).
  • the control unit 21 ends the process.
  • The serial number read from the barcode may also be recorded in the standby device DB 66.
  • FIG. 22 shows a program that is activated when a medical image is taken by the catheter control device 13 or the diagnostic imaging device 14.
  • the catheter control device 13 or the diagnostic imaging device 14 transmits the captured image data or link information indicating the storage destination of the image data to the information processing device 20. After that, the program shown in FIG. 22 is executed.
  • the control unit 21 of the information processing device 20 acquires the captured image data (step S511).
  • the control unit 21 reduces the image acquired in step S511 to create a thumbnail image (step S512).
  • the control unit 21 may acquire thumbnail images created by the catheter control device 13 or the diagnostic imaging device 14.
  • the control unit 21 creates a new record in the log DB65.
  • The control unit 21 records the time when the image was taken in the time field, the information about the device that took the image in the action field, and, in the file field, the file of the image acquired in step S511 and the thumbnail image file created in step S512 (step S513).
  • When the patient status is input, the control unit 21 records the patient status in the patient status field. Similarly, when the vitals are confirmed, the control unit 21 records the vitals in the vital field. When a treatment strategy is devised based on the captured image, the control unit 21 records a file in which the treatment strategy is recorded in the strategy field. After that, the control unit 21 ends the process.
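  • A thumbnail is simply a reduced copy of the captured medical image. The following is a minimal sketch of steps S511 to S513, assuming Pillow for the reduction and a 128-pixel thumbnail size (both are assumptions, not requirements of the specification):

```python
from datetime import datetime
from pathlib import Path
from PIL import Image

def record_captured_image(image_path: str, device_name: str, log: list) -> None:
    """Reduce the captured image to a thumbnail and append a log entry containing
    the capture time, the capturing device, and both files."""
    original = Image.open(image_path)
    thumbnail = original.copy()
    thumbnail.thumbnail((128, 128))                          # reduced image
    thumb_path = Path(image_path).with_suffix(".thumb.png")
    thumbnail.save(thumb_path)
    log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "action": f"image captured by {device_name}",
        "files": [image_path, str(thumb_path)],
    })
```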
  • FIG. 23 shows a program that is activated when a tap of the thumbnail image 522 displayed on the touch panel 35 is received.
  • the control unit 31 of the terminal device 30 determines whether or not the terminal device 30 is set to display the original image on the display device 12 when the tap of the thumbnail image 522 is received (step S621).
  • When the terminal device 30 is set to display the original image on the display device 12 (YES in step S621), the control unit 31 of the terminal device 30 transmits information regarding the tapped thumbnail image 522 to the information processing device 20 (step S622).
  • the information about the thumbnail image 522 is, for example, the time when the thumbnail image was taken.
  • the control unit 31 ends the process.
  • the control unit 21 of the information processing device 20 receives information about the tapped thumbnail image (step S521).
  • the control unit 21 searches the log DB 65 and acquires the original image (step S522).
  • the control unit 21 transmits the original image to the predetermined display device 12 (step S523). As a result, the original image is displayed on the display device 12.
  • When the terminal device 30 is not set to display the original image on the display device 12 (NO in step S621), the control unit 31 of the terminal device 30 transmits information regarding the tapped thumbnail image 522 to the information processing device 20 (step S623).
  • the control unit 31 of the terminal device 30 receives the original image (step S624).
  • the control unit 31 displays the original image on the touch panel 35 (step S625).
  • the control unit 31 ends the process.
  • FIG. 24 shows a program to be started when the device button 53 is selected by the user.
  • the control unit 31 of the terminal device 30 transmits information indicating that the device button 53 has been selected to the information processing device 20 (step S631).
  • the control unit 21 of the information processing device 20 receives the information (step S531).
  • the control unit 21 transmits the standby device DB 66 to the terminal device 30 (step S532).
  • the control unit 31 of the terminal device 30 receives the standby device DB 66 (step S632).
  • the control unit 31 displays the equipment list column 539 as described with reference to FIG. 10 (step S633).
  • the control unit 31 accepts a selection operation such as a drag-and-drop operation of the device icon 531 (step S634).
  • the control unit 31 transmits information about the equipment that has received the selection operation to the information processing device 20 (step S635).
  • the control unit 21 creates a new record in the log DB 65 (step S535).
  • the control unit 21 records the time when the selection operation is received in the time field and the information about the device which received the selection operation in the action field.
  • the control unit 21 determines whether or not the equipment that has received the selection is the diagnostic imaging catheter 132 (step S536). If it is determined that the catheter is not the diagnostic imaging catheter 132 (NO in step S536), the control unit 21 is in a standby state waiting for the next input. When it is determined that the catheter 132 is for diagnostic imaging (YES in step S536), the control unit 21 determines whether or not the catheter control device 13 has recorded an image (step S537).
  • If it is determined that no image has been recorded (NO in step S537), the control unit 21 enters a standby state waiting for the next input. When it is determined that an image has been recorded (YES in step S537), the control unit 21 activates the image recording subroutine (step S538).
  • The image recording subroutine is a program that performs the same processing as the program described with reference to FIG. 22. The difference is that the file is recorded in the file field of the record created in step S535 without creating a new record in step S513.
  • the control unit 31 of the terminal device 30 determines whether or not to end the display of the equipment list column 539 (step S636). For example, when the user instructs the end by a double tap operation or the like, the control unit 31 determines that the display of the equipment list column 539 is finished.
  • If it is determined that the process does not end (NO in step S636), the control unit 31 returns to step S634. When it is determined that the process ends (YES in step S636), the control unit 31 hides the equipment list column 539 (step S637). The control unit 31 ends the process.
  • FIG. 25 shows a program that is activated when the medication button 54 is selected by the user.
  • the control unit 31 of the terminal device 30 displays the drug list column 549 as described with reference to FIG. 11 (step S641).
  • the control unit 31 accepts the selection of the drug button 541 (step S642).
  • the control unit 31 transmits the type and dose of the selected drug to the information processing device 20 (step S643).
  • the control unit 21 of the information processing device 20 receives the type and dose of the drug for which selection has been accepted (step S541).
  • the control unit 21 updates the drug cumulative amount DB67 (step S542). Specifically, the control unit 21 extracts a record related to the drug received in step S541 from the drug cumulative amount DB 67.
  • The control unit 21 adds the dose received in step S541 to the dose field.
  • FIG. 26 shows a program to be activated when the vital button 55 is selected by the user.
  • the control unit 31 of the terminal device 30 transmits the vital data transmission start instruction to the bedside monitor 16 (step S651).
  • the bedside monitor 16 receives the transmission start instruction (step S751).
  • the bedside monitor 16 starts transmitting vital data to the terminal device 30 (step S752).
  • the vital data is, for example, image data showing a screen displayed by the bedside monitor 16.
  • the vital data may be numerical data indicating the value of each item displayed by the bedside monitor 16.
  • the bedside monitor 16 sequentially transmits vital data to the terminal device 30.
  • the control unit 31 of the terminal device 30 receives vital data (step S652).
  • the control unit 31 displays the vital graph column 551 as described with reference to FIG. 12 (step S653).
  • the control unit 31 determines whether or not the tap operation of the vital graph column 551 has been accepted (step S654).
  • When the tap operation is accepted, the control unit 31 transmits the vitals displayed on the touch panel 35 to the information processing device 20 (step S655).
  • the control unit 21 receives vitals (step S551).
  • the control unit 21 of the information processing device 20 records the received vitals in the log DB 65 (step S552). For example, when the vitals are confirmed before and after an action such as drug administration, the control unit 21 records the vitals in the vital field of the record in which the action is recorded. When no other action is taken, the control unit 21 creates a new record in the log DB 65 and records the vitals.
  • When it is determined that the tap operation has not been accepted (NO in step S654), or after the end of step S655, the control unit 31 of the terminal device 30 determines whether or not to end the display of the vital graph column 551 (step S656). For example, when the user instructs the end by a double tap operation or the like, the control unit 31 determines that the display of the vital graph column 551 is to be ended.
  • If it is determined that the process does not end (NO in step S656), the control unit 31 returns to step S652. When it is determined that the process ends (YES in step S656), the control unit 31 transmits a vital transmission end instruction to the bedside monitor 16 (step S657). The control unit 31 hides the vital graph column 551 (step S658) and ends the process.
  • the bedside monitor 16 receives the transmission end instruction (step S753).
  • the bedside monitor 16 ends the transmission of vital data to the terminal device 30 (step S754).
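  • On the terminal side, steps S652 to S658 form a simple receive-and-render loop. The sketch below assumes the bedside monitor pushes vital data into a queue and that render_graph(), tap_requested(), send_to_server(), and stop_requested() are callbacks provided by the surrounding application; none of these names come from the specification.

```python
import queue

def vital_display_loop(vital_queue, render_graph, tap_requested, send_to_server, stop_requested):
    """Keep receiving vital data, redraw the vital graph column 551, and, when the
    graph is tapped, forward the currently displayed vitals so they are logged."""
    while not stop_requested():
        try:
            latest = vital_queue.get(timeout=0.5)   # data pushed by the bedside monitor
        except queue.Empty:
            continue
        render_graph(latest)                        # update the vital graph column
        if tap_requested():
            send_to_server(latest)                  # vitals recorded in the vital field
```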
  • The control unit 31 of the terminal device 30 may accept an operation by voice input via a microphone built into the terminal device 30 or the microphone 17 instead of the touch panel 35. Doctors and nurses who are scrubbed in and maintaining sterility can thereby operate the terminal device 30 by themselves without touching it.
  • According to the present disclosure, it is possible to provide an information processing method or the like that supports information sharing between staff members participating in treatment.
  • According to this embodiment, it is possible to provide an information processing system 10 that can record a catheter treatment log and share it in real time.
  • According to this embodiment, it is possible to provide an information processing system 10 that can easily record the use of equipment and the administration of a drug in a catheter treatment log, an information processing system 10 that can easily record the patient state, and an information processing system 10 that can easily record that a doctor or the like has paid attention to the vital signs.
  • According to this embodiment, it is possible to provide an information processing system 10 with which the equipment ready for use and the number of times it has been used can be confirmed on the screen described with reference to FIG. 10.
  • According to this embodiment, it is possible to provide an information processing system 10 that records, in a catheter treatment log, a treatment strategy determined by a doctor based on new information obtained by using the catheter control device 13 and the diagnostic imaging device 14.
  • According to this embodiment, it is possible to provide an information processing system 10 that allows staff and visitors participating in the catheter treatment to view a treatment strategy devised by the doctor and the captured images in real time.
  • According to this embodiment, it is possible to provide an information processing system 10 in which staff of each occupation can add comments and the like to the catheter treatment log after the catheter treatment is completed, and an information processing system 10 that automatically records a catheter treatment log that can be used for workflow analysis.
  • FIG. 27 is an example of a screen displayed by the terminal device 30 of the second embodiment.
  • the screen shown in FIG. 27 is a screen displayed in place of the screen described with reference to FIG. 7.
  • a thumbnail image 522 of the object arrangement image is displayed between the thumbnail image 522 of the image taken by IVUS at 10:30 and the strategy mark 525.
  • FIG. 28 is an explanatory diagram illustrating the configuration of the first model 61.
  • the first model 61 is a model that accepts the original image and outputs an object arrangement image that maps the types of a plurality of objects included in the original image and the range of each object in association with each other.
  • the first model 61 is generated by machine learning.
  • the object arrangement image is an example of a converted image obtained by converting the original image.
  • the thumbnail of the object arrangement image is an example of a reduced conversion image obtained by reducing the converted image.
  • In FIG. 28, the vertical line hatching indicates the "cross section of the diagnostic imaging catheter 132", the downward-sloping hatching indicates the "inside of the luminal organ", and the thick left-sloping hatching indicates the "epicardium".
  • FIG. 28 schematically shows that each object is painted in a different color.
  • Coloring the objects is one example of displaying each object distinguishably. Each object may be displayed so as to be distinguishable from the other objects in any manner, such as by surrounding its outer edge with a line.
  • The control unit 21 may classify the entire area of the object arrangement image into objects such as "luminal organ wall", "plaque", "calcification", and "guide wire", and display them distinguishably.
  • the first model 61 is, for example, a semantic segmentation model, and includes an input layer, a neural network, and an output layer.
  • the neural network has a U-Net structure that realizes semantic segmentation.
  • the U-Net structure is composed of a multi-layered encoder layer and a multi-layered decoder layer connected behind the multi-layered encoder layer. Semantic segmentation assigns a label to each pixel that makes up the input image, indicating the type of object.
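  • The specification does not fix a concrete network definition; the following is a minimal sketch of a two-level U-Net-style encoder/decoder for per-pixel labeling, written with PyTorch as an assumed framework and with an arbitrary number of object classes.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-Net-style segmentation model: encoder, decoder, and one skip
    connection; outputs a score per object class for every pixel."""
    def __init__(self, in_channels: int = 1, num_classes: int = 4):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec1 = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                              # encoder, full resolution
        e2 = self.enc2(self.down(e1))                  # encoder, half resolution
        d1 = self.up(e2)                               # decoder, back to full resolution
        d1 = self.dec1(torch.cat([d1, e1], dim=1))     # skip connection
        return self.head(d1)                           # per-pixel class scores

# label_map = TinyUNet()(torch.randn(1, 1, 256, 256)).argmax(dim=1)  # object label per pixel
```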
  • the first model 61 is generated for each device that captures the original image.
  • the first model 61 may be further generated for each part to be imaged.
  • the first model 61 is recorded in the auxiliary storage device 23.
  • the first model 61 may be recorded in a large-capacity device or the like connected to the information processing system 10.
  • the object arrangement image of the original size is displayed on the display device 12 or the touch panel 35.
  • FIG. 29 is a flowchart illustrating a processing flow of the program of the second embodiment.
  • the program of FIG. 29 is executed in place of the program described with reference to FIG. 22 when an image is taken by the catheter control device 13 or the diagnostic imaging device 14.
  • the catheter control device 13 and the diagnostic imaging device 14 transmit the captured image data or link information indicating the storage destination of the image data to the information processing device 20. After that, the program shown in FIG. 29 is executed.
  • the control unit 21 acquires the captured image data (step S561).
  • the control unit 21 reduces the image acquired in step S561 to create a thumbnail image (step S562).
  • a thumbnail image of the object arrangement image is also created in step S562.
  • the control unit 21 may acquire thumbnail images created by the catheter control device 13 or the diagnostic imaging device 14.
  • the control unit 21 determines whether or not to perform conversion to the object arrangement image (step S563). Specifically, when the first model 61 corresponding to the conversion of the image data received in step S561 exists, the control unit 21 determines that the conversion is performed.
  • When it is determined that the conversion is not performed (NO in step S563), the control unit 21 creates a new record in the log DB 65.
  • The control unit 21 records the time when the image was taken in the time field, the information about the device that took the image in the action field, and, in the file field, the file of the image acquired in step S561 and the thumbnail image file created in step S562 (step S564).
  • the control unit 21 ends the process.
  • When it is determined that the conversion is performed (YES in step S563), the control unit 21 selects the first model 61 corresponding to the image data received in step S561 (step S565).
  • the control unit 21 inputs the image data acquired in step S561 into the first model 61 selected in step S565, and acquires an output object arrangement image (step S566).
  • the control unit 21 reduces the acquired object arrangement image to create a thumbnail image (step S567).
  • the control unit 21 creates a new record in the log DB65.
  • The control unit 21 records the time when the image was taken in the time field, the information about the device that took the image in the action field, and, in the file field, the image acquired in step S561, the thumbnail image created in step S562, the object arrangement image acquired in step S566, and the thumbnail image created in step S567 (step S568).
  • the control unit 21 ends the process.
  • the first model 61 is not limited to the model that converts the original image into the object arrangement image.
  • it may be a model that converts the original image into an image that has undergone arbitrary processing such as edge enhancement or contrast conversion.
  • According to this embodiment, it is possible to provide an information processing system 10 that displays an object arrangement image.
  • the user can easily interpret the image taken by the catheter control device 13 or the diagnostic imaging device 14 by referring to the object arrangement image.
  • the present embodiment relates to an information processing system 10 that automatically detects lesions and creates a treatment strategy.
  • the description of the parts common to the first embodiment will be omitted.
  • FIG. 30 is an explanatory diagram illustrating the configuration of the second model 62.
  • the second model 62 is a model that accepts the input of a medical image and outputs the type of lesion.
  • the second model 62 may be a model that accepts input of a plurality of medical images and outputs the type of lesion.
  • the plurality of medical images are, for example, images taken by the same device at different timings or under different imaging conditions.
  • the plurality of medical images may be images taken using different devices.
  • the second model 62 is generated for each device on which the medical image was taken and the part where the medical image was taken, and is recorded in the auxiliary storage device 23 or an external large-capacity storage device.
  • the second model 62 is a learning model generated by using an image classification algorithm such as CNN (Convolutional Neural Networks) or YOLO (You only look once).
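  • The second model 62 is an ordinary image classifier; CNN and YOLO are named only as examples of algorithms. A minimal CNN sketch under that assumption (PyTorch; the lesion class names are illustrative and not taken from the specification):

```python
import torch
import torch.nn as nn

LESION_CLASSES = ["stenosis", "dissection", "calcification", "no lesion"]  # illustrative

class LesionClassifier(nn.Module):
    """Small CNN mapping a single-channel medical image to lesion-type scores."""
    def __init__(self, num_classes: int = len(LESION_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))   # one score per lesion type

# idx = LesionClassifier()(torch.randn(1, 1, 256, 256)).argmax(dim=1).item()
# lesion_type = LESION_CLASSES[idx]
```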
  • FIG. 31 is an explanatory diagram illustrating the configuration of the third model 63.
  • the third model 63 is a model that receives a medical image taken by the catheter control device 13 or the diagnostic imaging device 14 and outputs a treatment strategy.
  • the treatment strategy output by the third model 63 is a type of treatment procedure, such as the "stent placement" described with reference to FIG.
  • the treatment strategy output by the third model 63 may be the specifications of the equipment used in the treatment procedure, such as the dimensions of the stent described with reference to FIG.
  • The third model 63 is generated for each device that took the medical image, each site where the medical image was taken, each lesion to be treated, and each time when the medical image was taken, and is recorded in the auxiliary storage device 23 or an external large-capacity storage device.
  • the time when the medical image is taken indicates, for example, the time during a series of catheter treatments such as immediately after the start of catheter treatment, after balloon expansion, and after stent placement.
  • the third model 63 is a model generated by using, for example, a CNN or YOLO algorithm.
  • the third model 63 may be a model that accepts input of patient information, a lesion site, or a catheter treatment log recorded before taking a medical image in addition to a medical image, and outputs a treatment strategy.
  • FIG. 32 is a flowchart illustrating a processing flow of the program of the third embodiment.
  • the program of FIG. 32 is executed when the doctor directs the planning of a treatment strategy.
  • the doctor specifies the medical image to be used.
  • the control unit 21 of the information processing device 20 selects the second model 62 to be used based on the device that captured the medical image designated by the doctor and the portion where the medical image was captured (step S571).
  • the control unit 21 inputs the medical image designated by the doctor into the second model 62 selected in step S571 to acquire the type of lesion (step S572).
  • the control unit 21 acquires, using the Grad-CAM (Gradient-weighted Class Activation Mapping) algorithm, the basis region for the lesion type acquired in step S572 (step S573); a minimal sketch of Grad-CAM itself is given after this flow.
  • the control unit 21 selects the third model 63 to be used based on the device that captured the medical image designated by the doctor, the site where the medical image was taken, the lesion to be treated, and the time when the medical image was taken (step S574).
  • the control unit 21 inputs the medical image designated by the doctor into the third model 63 selected in step S574 to acquire the treatment strategy (step S575).
  • the "stent placement" displayed at the bottom is the treatment strategy acquired in step S575.
  • the control unit 21 records the acquired treatment strategy in the log DB 65 (step S576).
  • the control unit 21 ends the process.
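  • A compact sketch of the above flow (steps S571 to S576) is shown below; the model registries, the stub inference functions and the in-memory log list are placeholders standing in for the second model 62, the third model 63, the Grad-CAM step and the log DB 65, and every key and label in the usage example is an assumption.

```python
# Hedged sketch of the flow of FIG. 32 (steps S571 to S576).
def classify_lesion(model, image):
    """S572 stand-in: return the lesion type for the image."""
    return model(image)

def basis_region(model, image):
    """S573 stand-in: return a basis-region heatmap (see the Grad-CAM sketch below)."""
    return "heatmap placeholder"

def plan_treatment(image, device, site, timing, second_models, third_models, log_db):
    classifier = second_models[(device, site)]                  # S571: select the second model 62
    lesion = classify_lesion(classifier, image)                 # S572: acquire the lesion type
    basis = basis_region(classifier, image)                     # S573: acquire the basis region
    strategist = third_models[(device, site, lesion, timing)]   # S574: select the third model 63
    strategy = strategist(image)                                # S575: acquire the treatment strategy
    log_db.append({"lesion": lesion, "strategy": strategy})     # S576: record in the log DB 65
    return strategy, basis

# Illustrative usage with dummy models.
second_models = {("IVUS", "coronary artery"): lambda img: "calcified"}
third_models = {("IVUS", "coronary artery", "calcified", "treatment start"): lambda img: "stent placement"}
log_db = []
print(plan_treatment("medical image", "IVUS", "coronary artery", "treatment start",
                     second_models, third_models, log_db))
```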
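  • For the basis-region acquisition of step S573, the following is a minimal, self-contained Grad-CAM sketch in PyTorch; the tiny network, input size and class count are illustrative, and only the gradient-weighted activation mapping itself follows the algorithm named above.

```python
# Minimal Grad-CAM sketch: gradients of the target class score with respect to the
# last convolutional feature maps are pooled into channel weights, and the weighted
# sum of the maps (after ReLU) highlights the basis region.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    """Illustrative classifier whose last feature maps are exposed for Grad-CAM."""
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
        )
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):
        feats = self.conv(x)                                   # feature maps used by Grad-CAM
        logits = self.fc(F.adaptive_avg_pool2d(feats, 1).flatten(1))
        return logits, feats

def grad_cam(model, image, class_idx):
    logits, feats = model(image)
    feats.retain_grad()                                        # keep gradients of the feature maps
    logits[0, class_idx].backward()
    weights = feats.grad.mean(dim=(2, 3), keepdim=True)        # global-average-pool the gradients
    cam = F.relu((weights * feats).sum(dim=1))                 # weighted sum of maps, then ReLU
    return (cam / (cam.max() + 1e-8)).detach()                 # normalized basis-region heatmap

model = TinyCNN()
heatmap = grad_cam(model, torch.randn(1, 1, 64, 64), class_idx=0)
print(heatmap.shape)   # coarse map over the image, highlighting the basis region
```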
  • the present embodiment thus provides an information processing system 10 that determines the type of lesion and formulates a treatment strategy.
  • the control unit 21 may accept a lesion type input by the user.
  • the control unit 21 may accept a treatment strategy input by the user.
  • the present embodiment relates to a method of generating the first model 61 described with reference to FIG. The description of the parts common to the second embodiment will be omitted.
  • a medical image is recorded in the input data field.
  • in the output data field, a painted image, in which a specialist such as a doctor has painted the medical image in a different color for each object, is recorded. That is, the output data field records the object corresponding to each pixel constituting the medical image.
  • in the training DB, a large number of combinations of medical images and painted images created by specialists and the like are recorded. The training DB is prepared for each combination of the device that captures the medical image and the site to be imaged.
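  • A minimal sketch of one such training record and its grouping is given below; the use of an integer per-pixel label map to represent the painted image, and the field and key names, are assumptions made for illustration.

```python
# Sketch of one training record of the first model 61 and its grouping by
# (imaging device, imaged site). Field names and keys are illustrative.
import numpy as np

def make_training_record(medical_image: np.ndarray, painted_image: np.ndarray) -> dict:
    # input data field: the raw medical image
    # output data field: an integer object label for every pixel of the image
    assert medical_image.shape == painted_image.shape
    return {"input": medical_image, "output": painted_image}

# One training DB is prepared per (imaging device, imaged site) combination.
training_db = {
    ("IVUS", "coronary artery"): [
        make_training_record(np.zeros((256, 256)), np.zeros((256, 256), dtype=np.int64)),
    ],
}
```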
  • FIG. 34 is a flowchart illustrating a processing flow of the program of the fourth embodiment. A case where machine learning of the first model 61 is performed using the information processing device 20 will be described as an example.
  • the program of FIG. 34 may be executed by hardware different from the information processing device 20, and the first model 61 for which machine learning has been completed may be copied to the auxiliary storage device 23 via the network.
  • the first model 61 trained on one piece of hardware can be used by a plurality of information processing devices 20.
  • the control unit 21 of the information processing device 20 acquires, from the training DB, the training records used for one epoch of training (step S811).
  • the control unit 21 adjusts the model parameters so that, when the input data is input to the input layer of the model, the corresponding output data is output from the output layer (step S812).
  • the control unit 21 determines whether or not to end the process (step S813). For example, when the control unit 21 finishes learning a predetermined number of epochs, it determines that the process is finished.
  • the control unit 21 may acquire test data from the training DB, input it to the model being machine-learned, and determine that the process ends when an output with a predetermined accuracy is obtained.
  • if it is determined that the process is not complete (NO in step S813), the control unit 21 returns to step S811. When it is determined that the process is complete (YES in step S813), the control unit 21 records the parameters of the trained model in the auxiliary storage device 23 (step S814). After that, the control unit 21 ends the process. Through the above processing, a trained model is generated.
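  • The following is a minimal PyTorch rendering of steps S811 to S814; the optimizer, loss function, fixed epoch count and save path are assumptions made for the example, and the alternative stopping test on held-out data mentioned above is omitted for brevity.

```python
# Minimal training loop in the spirit of FIG. 34 (steps S811 to S814).
import torch
import torch.nn as nn

def train(model: nn.Module, dataloader, max_epochs: int = 10,
          save_path: str = "first_model_61.pt") -> None:
    optimizer = torch.optim.Adam(model.parameters())
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(max_epochs):                    # S813: end after a fixed number of epochs
        for inputs, targets in dataloader:             # S811: acquire training records for one epoch
            optimizer.zero_grad()
            loss = loss_fn(model(inputs), targets)     # S812: adjust parameters so that the inputs
            loss.backward()                            #       map to the recorded output data
            optimizer.step()
    torch.save(model.state_dict(), save_path)          # S814: record the trained model parameters
```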
  • the present embodiment relates to a method of generating the second model 62 described with reference to FIG. The description of the parts common to the third embodiment will be omitted.
  • FIG. 35 is an explanatory diagram illustrating the record layout of the training DB of the fifth embodiment.
  • the training DB has an input data field and an output data field.
  • a medical image is recorded in the input data field.
  • in the output data field, the lesion type determined by a specialist such as a doctor interpreting the medical image is recorded.
  • in the training DB, a large number of combinations of medical images and the lesion types determined by specialists and the like are recorded.
  • the training DB is prepared for each combination of a device for capturing a medical image and a portion to be imaged.
  • the outline of the generation method of the second model 62 of the present embodiment will be described.
  • an untrained model having a CNN structure, including, for example, repeated convolutional and pooling layers followed by a fully connected layer and a softmax layer, is prepared.
  • the trained second model 62 is generated by the same processing flow as the processing described using the flowchart of FIG. 34 except that the structure of the prepared model is different.
  • the second model 62 can be generated by machine learning.
  • the present embodiment relates to a method of generating the third model 63 described with reference to FIG. The description of the parts common to the fifth embodiment will be omitted.
  • FIG. 36 is an explanatory diagram illustrating the record layout of the training DB of the sixth embodiment.
  • the training DB has an input data field and an output data field. A medical image is recorded in the input data field.
  • the output data field records the treatment strategy, such as the treatment method or the specifications of the equipment used, as determined by an expert with sufficient experience in catheter treatment.
  • in the training DB, a large number of combinations of medical images and treatment strategies are recorded.
  • the training DB is prepared for each combination of the device that captured the medical image, the site where the medical image was captured, the lesion to be treated, and the time when the medical image was captured.
  • the outline of the generation method of the third model 63 of the present embodiment will be described.
  • an untrained model having a CNN structure, including, for example, repeated convolutional and pooling layers followed by a fully connected layer and a softmax layer, is prepared.
  • the trained third model 63 is generated by the same processing flow as the processing described using the flowchart of FIG. 34 except that the structure of the prepared model is different.
  • the third model 63 can be generated by machine learning.
  • FIG. 37 is a functional block diagram of the information processing system 10 of the seventh embodiment.
  • the information processing system 10 includes a patient information acquisition unit 81, an action acquisition unit 82, and a display unit 83.
  • the patient information acquisition unit 81 acquires patient information regarding a patient undergoing catheter treatment.
  • the action acquisition unit 82 sequentially acquires the action related to the catheter treatment for the patient and the time when the action is performed.
  • the display unit 83 causes the first display device to display the patient information acquired by the patient information acquisition unit 81 and the time-series information in which the actions acquired by the action acquisition unit 82 are arranged in time series.
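  • The division of roles among the three units can be sketched as follows; the data structures, field names and the console output standing in for the first display device are illustrative assumptions only.

```python
# Sketch of the functional blocks of FIG. 37: patient information acquisition unit 81,
# action acquisition unit 82 and display unit 83.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TimelineEntry:
    time: datetime
    action: str                                   # an action related to the catheter treatment

@dataclass
class InformationProcessingSystem:
    patient_info: dict = field(default_factory=dict)
    timeline: list[TimelineEntry] = field(default_factory=list)

    def acquire_patient_info(self, source: dict) -> None:          # patient information acquisition unit 81
        self.patient_info.update(source)

    def acquire_action(self, action: str, time: datetime) -> None:  # action acquisition unit 82
        self.timeline.append(TimelineEntry(time, action))
        self.timeline.sort(key=lambda e: e.time)                    # keep actions in time series

    def display(self) -> None:                                      # display unit 83 (console stands in
        print(self.patient_info)                                    # for the first display device)
        for entry in self.timeline:
            print(entry.time.isoformat(), entry.action)
```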
  • FIG. 38 is an explanatory diagram illustrating the configuration of the information processing device 20 according to the eighth embodiment.
  • the present embodiment relates to a mode in which the information processing device 20 is realized by operating a general-purpose computer 90 in combination with a program 97. The description of the parts common to the first embodiment will be omitted.
  • the computer 90 includes a control unit 21, a main storage device 22, an auxiliary storage device 23, a communication unit 24, a reading unit 29, and a bus.
  • the computer 90 is an information device such as a general-purpose personal computer, a tablet, a smartphone, or a server computer.
  • Program 97 is recorded on the portable recording medium 96.
  • the control unit 21 reads the program 97 via the reading unit 29 and stores it in the auxiliary storage device 23. The control unit 21 may also read the program 97 stored in a semiconductor memory 98, such as a flash memory, mounted in the computer 90. Further, the control unit 21 may download the program 97, via the communication unit 24, from another server computer (not shown) connected over a network (not shown) and store it in the auxiliary storage device 23.
  • the part of the program 97 that is executed by the computer 90 is installed as a control program of the computer 90, loaded into the main storage device 22, and executed.
  • the computer 90 functions as the information processing device 20 described above.
  • the portion of the program 97 that is executed by the terminal device 30 is transmitted via the network and installed as a control program of the terminal device 30.
  • the terminal device 30 performs the above-mentioned function in cooperation with the information processing device 20.
  • the computer 90 is a general-purpose personal computer, a tablet, a smartphone, a mainframe computer, a virtual machine running on a mainframe computer, a cloud computing system, or a quantum computer.
  • the computer 90 may be a plurality of personal computers or the like that perform distributed processing.

PCT/JP2021/009236 2020-03-30 2021-03-09 情報処理方法、プログラムおよび情報処理システム WO2021199952A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022511724A JP7699105B2 (ja) 2020-03-30 2021-03-09 情報処理方法、プログラムおよび情報処理システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020061516 2020-03-30
JP2020-061516 2020-03-30

Publications (1)

Publication Number Publication Date
WO2021199952A1 true WO2021199952A1 (ja) 2021-10-07

Family

ID=77928538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/009236 WO2021199952A1 (ja) 2020-03-30 2021-03-09 情報処理方法、プログラムおよび情報処理システム

Country Status (2)

Country Link
JP (1) JP7699105B2
WO (1) WO2021199952A1

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003263495A (ja) * 2002-03-08 2003-09-19 Hitachi Medical Corp 医療情報管理システム
JP2004154341A (ja) * 2002-11-06 2004-06-03 Sysmex Corp 糖尿病診断支援システム
JP2008188329A (ja) * 2007-02-07 2008-08-21 Konica Minolta Medical & Graphic Inc 診断支援システム
JP2015097065A (ja) * 2013-11-15 2015-05-21 株式会社東芝 手術情報管理装置
JP2019180822A (ja) * 2018-04-10 2019-10-24 Dgshape株式会社 手術用器具管理システム

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025141844A1 (ja) * 2023-12-28 2025-07-03 三菱電機株式会社 支援システム、サーバ装置、及び支援方法

Also Published As

Publication number Publication date
JP7699105B2 (ja) 2025-06-26
JPWO2021199952A1

Similar Documents

Publication Publication Date Title
US20250195032A1 (en) Intraluminal ultrasound navigation guidance and associated devices, systems, and methods
JP7206024B2 (ja) 解剖モデリングシステム及びその作動方法
JP2022525469A (ja) 動脈の撮像・評価のシステム及び方法並びに関連するユーザインタフェースに基づくワークフロー
US10902941B2 (en) Interventional radiology structured reporting workflow utilizing anatomical atlas
JP2021142320A (ja) 血管造影投影をコンピュータ断層撮影データに位置合わせするためのシステムおよび方法
US20110173027A1 (en) Health-risk metric determination and/or presentation
US11049595B2 (en) Interventional radiology structured reporting workflow
JP2023123542A (ja) 血管抽出装置および血管抽出方法
WO2021199952A1 (ja) 情報処理方法、プログラムおよび情報処理システム
CN119234278A (zh) 使用cath lab图像进行医师培训和沟通
WO2023052278A1 (en) Intraluminal ultrasound vessel segment identification and associated devices, systems, and methods
WO2021199960A1 (ja) プログラム、情報処理方法、および情報処理システム
Sra et al. Identifying the third dimension in 2D fluoroscopy to create 3D cardiac maps
JP7644092B2 (ja) プログラム、情報処理方法、学習モデルの生成方法、学習モデルの再学習方法、および、情報処理システム
JP7313888B2 (ja) 医用情報処理装置、医用情報処理方法及びプログラム
US20050222509A1 (en) Electrophysiology system and method
JP7577734B2 (ja) プログラム、情報処理方法および情報処理装置
JP7737357B2 (ja) プログラム、情報処理方法、および情報処理システム
JP7701193B2 (ja) 医用画像処理装置及び方法
JP2022142607A (ja) プログラム、画像処理方法、画像処理装置及びモデル生成方法
EP4588057A1 (en) Clinical device tracking and monitoring system
JP2024142136A (ja) プログラム、情報処理方法、情報処理装置および学習済モデル生成方法
JP2023151428A (ja) プログラム、学習済モデル生成方法、情報処理方法および情報処理装置
CN119173956A (zh) 导管室规程的视频和音频捕获
WO2024263539A1 (en) High resolution synthetic medical imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21779705

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022511724

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21779705

Country of ref document: EP

Kind code of ref document: A1