WO2021220872A1 - System, method, and program for surgery operating information management - Google Patents

System, method, and program for surgery operating information management Download PDF

Info

Publication number
WO2021220872A1
WO2021220872A1 (PCT/JP2021/015944)
Authority
WO
WIPO (PCT)
Prior art keywords
person
role
surgery
data
information management
Prior art date
Application number
PCT/JP2021/015944
Other languages
French (fr)
Japanese (ja)
Inventor
卓也 中村
毅 前田
弘充 松浦
加奈 松浦
Original Assignee
ソニーグループ株式会社
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2021220872A1 publication Critical patent/WO2021220872A1/en

Links

Images

Classifications

    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 — ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 — ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms

Definitions

  • This disclosure relates to a surgical operation information management system, a surgical operation information management method and a program.
  • The technique of Patent Document 1 can only determine whether or not a medical staff member is in charge of a patient; it cannot determine the staff member's role in the operating room, the content of their work, or the like.
  • In addition, staff in the operating room wear nearly identical clothing, so an individual (person) cannot be identified with general-purpose technology.
  • Furthermore, at present no work records are kept. It is therefore desirable to improve the efficiency of surgery-related work by collecting work information such as work content, working time, and number of tasks for each staff member's role, while keeping the added processing burden as low as possible, and by providing the information needed to improve operating room operation.
  • The present technology was made in view of this situation, and aims to provide a surgical operation information management system, a surgical operation information management method, and a program capable of improving the efficiency of surgery-related work.
  • The surgical operation information management system of the embodiment includes a data processing unit that acquires information related to a person or to an object included in an image captured in an operating room or an operating-room-related area, and a situation determination unit that estimates the role in the surgery of a person involved in the surgery based on the information related to the person and to the object.
  • FIG. 1 is a schematic block diagram of the surgical operation information management system of the embodiment.
  • FIG. 2 is a detailed block diagram of a main part of the surgical operation information management system of the first embodiment.
  • FIG. 3 is an explanatory diagram of an example of the determination database.
  • FIG. 4 is a flowchart of the determination database construction process.
  • FIG. 5 is an outline processing flowchart of the process of estimating the role of a person.
  • FIG. 6 is a detailed processing flowchart of a main part of the processing for estimating the role of a person.
  • FIG. 7 is an explanatory diagram of a specific example of role estimation.
  • FIG. 8 is an explanatory diagram (preoperative and intraoperative) of a visualization example of work contents.
  • FIG. 9 is an explanatory diagram (postoperative) of a visualization example of the work content.
  • FIG. 10 is a detailed block diagram of a main part of the surgical operation information management system of the second embodiment.
  • FIG. 11 is an explanatory diagram of an example of data to be added in the second embodiment.
  • FIG. 12 is an explanatory diagram of a configuration example of the determination database of the second embodiment.
  • FIG. 13 is a detailed block diagram of a main part of the surgical operation information management system of the third embodiment.
  • FIG. 14 is an explanatory diagram of an example of acquired data of the data processing unit of the third to fifth embodiments.
  • FIG. 15 is a detailed block diagram of a main part of the surgical operation information management system of the fourth embodiment.
  • FIG. 16 is a detailed block diagram of a main part of the surgical operation information management system according to the fifth embodiment.
  • FIG. 1 is a schematic block diagram of the surgical operation information management system of the embodiment.
  • The surgical operation information management system 10 includes a data collection device 11 equipped with one or more cameras that capture images in the operating room before, during, and after surgery and output imaging data; a management server 14 having a data processing device 12 that performs data processing based on the imaging data and determines the surgical operation status, and a determination database (DB) 13 that stores determination data used for that determination; and an information presentation system 15 that visualizes and presents the determined surgical operation status.
  • the surgical operation information management system 10 is connected to the in-hospital system 20 which can operate in cooperation with each other as needed via a communication network.
  • In the above configuration, the data processing device 12 processes the imaging data input from the cameras and determines the role of each person, such as a worker who carries in and sets up the various devices and surgical instruments, a doctor actually involved in the surgery (surgeon, assistant, or support doctor), or a nurse.
  • FIG. 2 is a detailed block diagram of a main part of the surgical operation information management system of the first embodiment.
  • the data collection device 11 includes a plurality of cameras 11A.
  • Types of camera 11A used in the data collection device 11 include a camera with a video function using a CCD/CMOS sensor, a camera with an interval shooting function, and a camera capable of capturing moving or still images over a wired or wireless connection. It is also possible to use a camera that additionally has an IR (near-infrared) sensor for distance measurement by TOF (Time Of Flight) or the like.
  • In this case, the cameras are placed in the operating room at positions from which a panoramic view of the surgery, centered mainly on the operating table, can be captured. Although it depends on the size and floor plan of the operating room, the layout of the operating table, and the imaging range of the cameras used, they are installed where they do not interfere with the surgery or with the flow lines of the staff.
  • A camera provided outside the operating room may be installed in a corridor of the surgical department floor, an equipment room, or the like. In other words, cameras can be installed anywhere in the areas related to the surgery, inside or outside the operating room, as long as at least the operating room itself can be imaged.
  • As the imaging data acquired from the cameras, still image data in compressed image formats such as JPEG or in uncompressed image formats such as BMP, moving image data such as MPEG, still images cut out from moving image data, and the like are used.
  • the data processing device 12 includes a data processing unit 12A and a status determination unit 12B.
  • the data processing unit 12A discriminates an object (person and object) from the captured image corresponding to the captured data acquired by the data collection device 11, and acquires the posture of the person, the position of the person and the object, and the time as basic data.
  • the posture of the person and the position of the person and the object are detected by using a general object recognition process by deep learning such as YOLO, a posture / skeleton detection algorithm such as OpenPose, and the like.
  • the time can be obtained from the time code of the imaging data.
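  • As a rough illustration of how such per-frame basic data could be assembled, the following sketch assumes hypothetical wrapper functions detect_objects (a YOLO-style detector) and estimate_poses (an OpenPose-style skeleton estimator); the wrappers, the naive person-to-skeleton pairing, and the record layout are illustrative assumptions rather than the implementation of this publication.

```python
def extract_basic_data(frame_image, frame_timecode):
    """Assemble the per-frame basic data: detected persons/objects, their image positions, and the time."""
    detections = detect_objects(frame_image)   # hypothetical YOLO-style detector: [(label, bbox, confidence), ...]
    poses = estimate_poses(frame_image)        # hypothetical OpenPose-style estimator: one joint array per person

    persons, objects = [], []
    for label, bbox, confidence in detections:
        record = {"label": label, "bbox": bbox, "confidence": confidence}
        if label == "person":
            persons.append(record)
        else:
            objects.append(record)             # e.g. medical devices, instruments, materials

    # Naive association: pair detected persons with estimated skeletons in detection order.
    for person, joints in zip(persons, poses):
        person["joints"] = joints

    # The time is taken from the imaging data's time code, as described above.
    return {"time": frame_timecode, "persons": persons, "objects": objects}
```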
  • Examples of objects other than persons to be discriminated by the data processing unit 12A include medical devices, medical instruments, and medical materials that are placed in or carried into the operating room.
  • Examples of medical devices include monitors, trolleys, endoscope light sources, endoscope camera control units (endoscope CCUs), endoscope camera heads (endoscope CHs), microscopes, ultrasonic diagnostic machines, sphygmomanometers, pulse oximeters, electrocardiographs, thermometers (rectal thermometers, etc.), urinary catheters, ventilation masks, and the like.
  • Examples of instruments include forceps, grasping forceps, needle holders, tweezers, scissors, scalpels, electrosurgical knives (bipolar, monopolar), laser scalpels, ultrasonic coagulation and incision devices, staplers, trocars, laryngoscopes, and the like.
  • Examples of medical materials include gauze, needles, tissue adhesive seals, physiological saline, various infusions, blood for transfusion, oxygen cylinders, laughing gas cylinders, and the like.
  • The position data detected by the data processing unit 12A includes the two-dimensional position (for example, (x, y)) obtained by projecting the position of an object onto the floor, wall, or ceiling of the operating room, or the three-dimensional position of the object in the operating room (for example, (x, y, z)).
  • A direct distance measuring technique such as TOF or LiDAR (Light Detection and Ranging) may also be used.
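  • As one possible way to obtain such a two-dimensional floor position, the sketch below maps a detected image point (for example, a person's foot point) to floor coordinates using a precomputed camera-to-floor homography; the homography and its prior calibration are assumptions made for illustration and are not prescribed by this publication.

```python
import numpy as np

def image_point_to_floor_xy(pixel_uv, H_cam_to_floor):
    """Project an image point onto the operating room floor plane.

    H_cam_to_floor is a 3x3 homography from image pixels to floor coordinates (e.g. meters),
    assumed to have been calibrated in advance for the installed camera.
    """
    u, v = pixel_uv
    q = H_cam_to_floor @ np.array([u, v, 1.0])
    return (q[0] / q[2], q[1] / q[2])   # (x, y) on the floor
```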
  • the determination database 13 stores the data acquired in the past cases for each case.
  • FIG. 3 is an explanatory diagram of an example of the determination database.
  • the determination database 13 includes a plurality of case-specific data 21-1 to 21-n. Since the case-specific data 21-1 to 21-n have the same configuration, the case-specific data 21-1 will be described below as an example.
  • A case ID (xxx-yyyy in the example of FIG. 3) that identifies the case is assigned to the case-specific data 21-1, which includes role data 31, work content data 32, and time data 33.
  • the role data 31 is data representing the role of a person in the operation of the case.
  • the role is an attribute indicating the position (expected work content or job) assigned to the target person in the surgery.
  • Examples of the role data 31 include the following roles: (1) Surgeon (2) Assistant (3) Direct caregiver (instrument delivery) (4) Indirect caregiver (circulating) (5) Anesthesiologist (6) Peri-anesthesia nurse (7) Clinical engineer (8) Operating room staff (carry-in and carry-out) (9) Contractor (cleaning, disinfection, assistance, etc.)
  • the work content data 32 is data representing the work content of each person.
  • The work content data 32 includes, for example, the following work contents.
  • The situation determination unit 12B compares the data acquired by the data processing unit 12A with the case-specific data 21-1 to 21-n of past cases stored in the determination database 13, and thereby estimates the corresponding role and work content of each person in the data acquired by the data processing unit 12A.
  • The role and work content may also be estimated based on temporal change information obtained from the imaging data of preceding and succeeding frames or from a plurality of captured images.
  • In addition, the situation determination unit 12B calculates, for each role, the time from the start to the end of each task (for example, the time from the start to the end of suturing), and the relative elapsed time based on key events of the surgery (sign-in, timeout, sign-out, etc.), as sketched below.
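  • A minimal sketch of such a calculation is shown below, assuming that task start/end times and the key-event times are available as datetime values; the event names and data layout are illustrative assumptions.

```python
from datetime import datetime

def relative_elapsed_times(task_start, task_end, key_events):
    """Return a task's duration and its start time relative to surgical key events.

    key_events is assumed to be a dict such as {"sign_in": t0, "timeout": t1, "sign_out": t2}.
    """
    duration_s = (task_end - task_start).total_seconds()
    relative_s = {name: (task_start - t).total_seconds() for name, t in key_events.items()}
    return duration_s, relative_s

# Example: suturing lasting 12 minutes, measured relative to sign-in and timeout.
events = {"sign_in": datetime(2021, 4, 1, 9, 0), "timeout": datetime(2021, 4, 1, 9, 5)}
duration_s, relative_s = relative_elapsed_times(
    datetime(2021, 4, 1, 10, 0), datetime(2021, 4, 1, 10, 12), events)
```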
  • the information presentation system 15 includes an information presentation processing unit 15A that performs processing for presenting various information, and a display 15B that presents various information under the control of the information presentation processing unit 15A.
  • FIG. 4 is a flowchart of the determination database construction process.
  • the database builder first prepares for shooting (step S11). Specifically, one or more cameras 11A are installed in the target operating room. Then, the position information of the installed camera 11A is acquired.
  • Next, the actual surgery is performed; imaging is carried out from the start to the end of the surgery, that is, before, during, and after the surgery, and the imaging data is collected in the management server 14 (step S12).
  • The data processing unit 12A of the data processing device 12 recognizes persons and objects in the captured images corresponding to the imaging data acquired by the data collection device 11 by general object recognition processing, and acquires object recognition information, including the posture of each person, by a skeleton detection algorithm or the like. It also acquires the position information of persons and objects, and acquires time information as basic data from the time code of the imaging data (step S13).
  • Specifically, the skeleton recognition result for each person on the screen is acquired for each frame of the imaging data output by each camera 11A, and the position of the person, the orientation of the body, and the position of the hands in the three-dimensional coordinate system set in the operating room are calculated from the relationship between the imaging positions of the cameras 11A.
  • For example, 18 or more joints are detected as a skeleton, the in-screen coordinates of the joints constituting the skeleton are acquired, and the position of the person, the orientation of the body, and the position of the hands in the three-dimensional coordinate system set in the operating room are calculated from the relationship between the imaging positions of the cameras 11A.
  • Likewise, the object recognition result for each object (person or thing) on the screen is acquired for each frame of each camera, and the position of the object in the three-dimensional coordinate system set in the operating room is calculated from the relationship between the imaging positions of the cameras 11A.
  • Specifically, each detected object is tagged as a person or as an object (a thing other than a person: for example, by medical device type), its position (in-screen coordinates) is acquired, and its position in the three-dimensional coordinate system set in the operating room is calculated from the positional relationship between the imaging positions of the cameras 11A.
  • Based on the orientation and position information of persons and objects obtained by the above processing, the status determination unit 12B of the data processing device 12 sets the relationship between each person and object across frames of the imaging data, for example as follows. (1) The person and the object maintain the same positional relationship and move by the same amount: the person is moving the object. (2) The object stays in the same position and the person faces the object: the person is involved with the object (for example, operating it). (3) The object is held in the person's hand: the person is holding the object. (4) A person is lying on an object: the patient is lying (sleeping) on a bed or the operating table. (5) The person faces the patient and the person's hand is at the same position as the patient: the person is involved with the patient.
  • In addition, the area where each person and object is located is set based on their position information. Specifically, the area around the patient and the area above the sterilized non-woven fabric detected as an object are set as clean areas, and the other areas are set as unclean areas. A rule-based sketch combining this with the relationship setting above is shown below.
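  • The following is a minimal rule-based sketch of relationship settings (1) to (3) and of the clean/unclean area classification; the distance thresholds, the drape-region representation, and the boolean inputs (facing, holding) are illustrative assumptions, not values from this publication.

```python
def relate_person_and_object(person_motion, object_motion, facing, holding, distance_m):
    """Classify the person-object relationship from per-frame displacement, facing, holding, and distance."""
    same_motion = all(abs(p - o) < 0.05 for p, o in zip(person_motion, object_motion))
    object_static = all(abs(o) < 0.01 for o in object_motion)

    if holding:
        return "holding_object"                 # rule (3): object held in the person's hand
    if same_motion and not object_static:
        return "moving_object"                  # rule (1): same positional relationship and movement
    if object_static and facing and distance_m < 1.0:
        return "working_on_object"              # rule (2): object fixed, person facing it
    return "no_relation"

def classify_area(position_xy, patient_xy, drape_regions, clean_radius_m=1.5):
    """Classify a floor position: near the patient or above a sterile drape region = clean."""
    dx, dy = position_xy[0] - patient_xy[0], position_xy[1] - patient_xy[1]
    if (dx * dx + dy * dy) ** 0.5 <= clean_radius_m:
        return "clean"
    if any(x0 <= position_xy[0] <= x1 and y0 <= position_xy[1] <= y1
           for (x0, y0, x1, y1) in drape_regions):
        return "clean"
    return "unclean"
```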
  • Next, the database builder, in collaboration with an expert, refers to the acquired object recognition information, position information, and time information, and performs tagging and correction (step S14). More specifically, based on the captured moving images, tags are assigned for the work of each staff member (doctor, nurse, etc.), the equipment each staff member uses, the role of each staff member, the clean area, the unclean area, and so on.
  • the erroneous information is corrected.
  • The processes of steps S11 to S14 described above are performed for a plurality of operations, a data set is constructed for each medical department and surgical procedure, and the result is used as the determination database 13 (step S15).
  • FIG. 5 is an outline processing flowchart of the process of estimating the role of a person.
  • The data processing worker first prepares for shooting (step S21). Specifically, one or more cameras 11A are installed in the target operating room and the position information of the installed cameras 11A is acquired. Alternatively, for an operating room in which cameras 11A are already installed, the previously acquired position information is confirmed and updated.
  • imaging is performed from the start to the end of the surgery, that is, before, during, and after the surgery, and the imaging data is collected in the management server 14 (step S22).
  • The data processing unit 12A of the data processing device 12 recognizes persons and objects in the captured images corresponding to the imaging data acquired by the data collection device 11 by general object recognition processing, and acquires object recognition information, including the posture of each person, by a skeleton detection algorithm or the like. It also acquires the position information of persons and objects, and acquires time information as basic data from the time code of the imaging data (step S23).
  • FIG. 6 is a detailed processing flowchart of a main part of the processing for estimating the role of a person.
  • The data processing unit 12A performs skeleton recognition for the persons on the screen for each frame of the imaging data output by each camera 11A (step S31). Subsequently, the data processing unit 12A extracts joints from the skeleton recognition result (step S32). Next, the data processing unit 12A extracts the joint positions and calculates the position of each person, the orientation of the body, and the position of the hands in the three-dimensional coordinate system set in the operating room from the relationship between the imaging positions of the cameras 11A (step S33).
  • For example, the data processing unit 12A extracts 18 or more joints from the skeleton detection result, acquires the in-screen coordinates of the joints constituting the skeleton, and calculates the position of the person, the orientation of the body, and the position of the hands in the three-dimensional coordinate system set in the operating room based on the relationship between the imaging positions of the cameras 11A (step S34).
  • the data processing unit 12A recognizes an object other than a person on the screen for each frame of each camera (step S35).
  • Specifically, each detected object is tagged as a person or as an object (a thing other than a person: for example, by medical device type), its position (in-screen coordinates) is acquired, and its position in the three-dimensional coordinate system set in the operating room is calculated from the positional relationship between the imaging positions of the cameras 11A (step S36).
  • Next, the situation determination unit 12B refers to the determination database 13 and tags each object with its type (for example, the type of medical device) (step S37).
  • The status determination unit 12B of the data processing device 12 then calculates, across frames of the imaging data, the positions of persons and objects in the three-dimensional coordinate system of the operating room based on the orientation and position information obtained by the above processing (step S38).
  • the status determination unit 12B of the data processing device 12 sets the positional relationship between the person and the corresponding object (step S39). Further, the situation determination unit 12B sets an area (clean area or unclean area) where people and objects exist (step S40).
  • Subsequently, the status determination unit 12B of the data processing device 12 compares the object recognition information, position information, time information, and moving images corresponding to the imaging data acquired in step S23 with the data sets stored in the determination database 13, and performs matching by AI (step S24).
  • The situation determination unit 12B then tags each person included in the object recognition information with the role having the highest matching rate (surgeon, direct caregiver, anesthesiologist, etc.) as that person's role (step S25).
  • FIG. 7 is an explanatory diagram of a specific example of role estimation.
  • As shown in FIG. 7, the data processing unit 12A of the data processing device 12 acquires, as basic data D0 for each frame, the posture and position of each person, the type and position of each piece of equipment as an object, and the shooting time.
  • In the case of frame data No. 1 of FIG. 7, the posture of person A (joints 1x, y, ...) and the position of person A (xx, yy) are stored, together with the type A, product name B, and position (xw, yy) of equipment M.
  • the determination database 13 stores the role, work content, posture, position, equipment used, equipment position, etc. of each person for the past cases as the determination data DR.
  • For example, as shown on the right side of FIG. 7, two data entries are stored for role X. In one entry, the work content is F, the posture of the worker (joints 1x, y, ...) and the position of the worker (ww, yy) are stored, together with the type C, product name D, and position (xz, yy) of the equipment used. In the other entry, the work content is G, and the posture of the worker (joints 1x, y, ...), the position of the worker, the type C and product name D of the equipment used, and the position (xw, yy) of the equipment used are stored.
  • The situation determination unit 12B compares the data acquired by the data processing unit 12A with the data stored in the determination database 13, estimates the work content and role of the persons corresponding to the acquired data, and calculates the time required for each task (working time) and the relative time with respect to each event.
  • For example, when the posture of person A has a high matching rate with the posture of role X in work G, the position of person A has a high matching rate with the position of role X in work G, and the equipment A has a high matching rate with the equipment used by role X, person A is estimated to be role X performing work content G (see the sketch below).
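  • A minimal sketch of this matching is given below: similarity scores for posture, position, and equipment are combined into one matching rate, and the past-case entry with the highest rate provides the estimated role and work content. The similarity measures, weights, and record layout are illustrative assumptions (the publication only states that matching is performed against the determination database, e.g. by AI).

```python
import numpy as np

def role_matching_rate(observed, reference):
    """Combine posture, position, and equipment similarities into a single matching rate.

    'joints' arrays are assumed to have the same length; the 0.4/0.3/0.3 weights are assumptions.
    """
    posture_sim = 1.0 / (1.0 + np.linalg.norm(np.asarray(observed["joints"]) - np.asarray(reference["joints"])))
    position_sim = 1.0 / (1.0 + np.linalg.norm(np.asarray(observed["position"]) - np.asarray(reference["position"])))
    equipment_sim = 1.0 if observed["equipment"] == reference["equipment"] else 0.0
    return 0.4 * posture_sim + 0.3 * position_sim + 0.3 * equipment_sim

def estimate_role(observed, determination_db):
    """Return the (role, work_content) of the past-case entry with the highest matching rate."""
    best = max(determination_db, key=lambda entry: role_matching_rate(observed, entry))
    return best["role"], best["work_content"]
```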
  • FIG. 8 is an explanatory diagram (preoperative and intraoperative) of a visualization example of work contents.
  • FIG. 9 is an explanatory diagram (postoperative) of a visualization example of the work content.
  • Since the real-time progress status, that is, the working time related to the patient, can be grasped, it is possible to predict the influence on the schedule of the surgery or of the operating room and whether or not support is necessary.
  • FIG. 10 is a detailed configuration block diagram of a main part of the surgical operation information management system of the second embodiment.
  • The second embodiment of FIG. 10 differs from the first embodiment of FIG. 2 in that it includes a patient estimation processing unit 12C that estimates the position and posture of the patient, and a weighting coefficient addition unit 12D that generates a weighting coefficient based on the estimation result (position and posture) of the patient estimation processing unit 12C and the surgical procedure information (information on the content of the surgery) acquired from the in-hospital system, adds it to the estimation result of the patient estimation processing unit 12C, and outputs the result.
  • For each surgical procedure, the vector (direction, distance) from a part of the patient, such as the head side, to the person or object whose work content is to be estimated is largely determined in advance. For example, if there is a person holding a laryngoscope within a few tens of centimeters of the patient's head, that person is likely to be an anesthesiologist performing tracheal intubation.
  • In such a case, the weighting coefficient for the anesthesiologist is increased for the person's role, and the weighting coefficient for tracheal intubation is increased for the work content.
  • This makes it possible to increase the probability that the estimated role and work content of the person are correct.
  • Specifically, the position and posture of the patient from the patient estimation processing unit 12C and the surgical procedure information acquired from the in-hospital system 20 are input to the weighting coefficient addition unit 12D, and the estimation result of the patient estimation processing unit 12C with the weighting coefficient added is output to the situation determination unit 12B, so that the situation determination unit 12B can use the vector from each part of the patient to the person or object for estimating the person's role and work content, as in the sketch below.
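  • The sketch below illustrates this idea: the distance and direction from the patient's head to a person are computed, and the role/work-content weights are raised when the person holds a laryngoscope close to the head, as in the example above. The 0.3 m threshold, the weight values, and the label names are illustrative assumptions.

```python
import math

def patient_relative_weights(person_xy, patient_head_xy, held_object_type):
    """Generate role/work-content weights from the patient-to-person vector (second embodiment idea)."""
    dx = person_xy[0] - patient_head_xy[0]
    dy = person_xy[1] - patient_head_xy[1]
    distance_m = math.hypot(dx, dy)
    direction_deg = math.degrees(math.atan2(dy, dx))

    role_weights = {"anesthesiologist": 1.0}
    work_weights = {"tracheal_intubation": 1.0}
    if held_object_type == "laryngoscope" and distance_m < 0.3:
        role_weights["anesthesiologist"] = 2.0       # raise the anesthesiologist weighting
        work_weights["tracheal_intubation"] = 2.0    # raise the tracheal intubation weighting
    return {"distance_m": distance_m, "direction_deg": direction_deg,
            "role_weights": role_weights, "work_weights": work_weights}
```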
  • FIG. 11 is an explanatory diagram of an example of data to be added in the second embodiment.
  • Examples of the data acquired in addition to the basic data D0 in the second embodiment include the distance data DA1 between the patient and the person, the distance data DA2 between the patient and the equipment, and the vector data DA3 from the patient to the person.
  • In the example of FIG. 11, the distance data DA1 is the distance between the patient and person A, stored as xxx cm; the distance data DA2 is the distance between the patient and equipment C, stored as yy cm; and the vector data DA3 is the vector from the patient to person C, with a direction of xx° and a distance of xx cm.
  • FIG. 12 is an explanatory diagram of a configuration example of the determination database of the second embodiment.
  • the determination database 13 at this time stores the surgical procedure data DR-A1, the surgeon data DR-A2, and the workflow data DR-A3 as the determination data DR-A for past cases.
  • the surgical procedure data DR-A1 stores the surgical procedure xxx
  • the surgeon data DR-A2 stores the surgeon yyyy.
  • the workflow data DR-A3 in the example of FIG. 12, the preoperative workflow data DR-A31 and the intraoperative workflow data DR-A32 are stored.
  • The preoperative workflow data DR-A31 and the intraoperative workflow data DR-A32 each store one or more detailed data entries for a pattern of arrangement of people and objects (for example, pattern A).
  • FIG. 13 is a detailed block diagram of a main part of the surgical operation information management system of the third embodiment.
  • The third embodiment of FIG. 13 differs from the first embodiment of FIG. 2 in that it includes a weighting coefficient addition unit 12E that generates and outputs a weighting coefficient based on the surgical procedure information (information on the content of the surgery) from the operating room system 20A constituting the in-hospital system 20.
  • The weighting of the work content is changed and output to the situation determination unit 12B, so that the situation determination unit 12B can use the relative time from the event of each task for estimating the work content, and the probability that the estimated role and work content of the person are correct can be increased.
  • Furthermore, the surgical procedure information associated with the surgeon from the operating room system 20A is input to the weighting coefficient addition unit 12E, the weighting of the work content is changed accordingly, and the result is output to the situation determination unit 12B.
  • As a result, the surgical procedure can be used to estimate the work content, and the probability that the estimated role and work content of the person are correct can be increased.
  • FIG. 14 is an explanatory diagram of an example of acquired data of the data processing unit of the third to fifth embodiments.
  • For example, as shown in FIG. 14, the data processing unit 12A of the data processing device 12 acquires, in addition to the basic data D0 (see FIG. 7) including the posture and position of each person, the type and position of each piece of equipment as an object, and the shooting time for each frame, surgical procedure data D1, surgeon data D2, equipment information data D3, and color information data D4. In the third embodiment, it suffices for the data processing unit 12A to acquire at least the basic data D0, the surgical procedure data D1 (surgical procedure xxx in the example of FIG. 12), and the surgeon data D2 (surgeon yyy in the example of FIG. 12) among these acquired data.
  • the accuracy of estimating the role of the person and the work content can be improved.
  • FIG. 15 is a detailed configuration block diagram of a main part of the surgical operation information management system of the fourth embodiment.
  • The fourth embodiment of FIG. 15 differs from the first embodiment of FIG. 2 in that it includes a weighting coefficient addition unit 12F that generates and outputs a weighting coefficient based on the surgical procedure information (information on the content of the surgery) from the operating room system 20A constituting the in-hospital system 20 and the equipment information from the operating room device 41.
  • In the fourth embodiment, it suffices for the data processing unit 12A to acquire at least the basic data D0 and the equipment information data D3 among the acquired data D0 to D4 shown in FIG. 14 (in the example of FIG. 14, the functions related to equipment A (functions A and B), their operation start time (time xx:ya) and operation end time (time xx:yb), and the functions related to equipment B).
  • For each surgical procedure, the equipment used and its operating time are largely determined in advance. For example, if a person is operating a button on the monitor and the time of the operation (or the period from the start to the end of the operation) is close to the monitor's operation start time, the action is likely to be monitor setup work at carry-in. If the operation time is close to the operation end time, it is likely to be shutdown work at carry-out.
  • the situation determination unit 12B can use the operating status of the operating room device 41 for estimating the work content, and can increase the probability that the estimation in estimating the role of the person and the work content is correct.
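  • The following sketch shows one way such equipment-based weighting could work: a work-content candidate is weighted up when a person's button operation on a device is close in time to that device's recorded operation start or end. The 300-second window and the weight values are illustrative assumptions.

```python
def equipment_time_weights(button_press_time, device_on_time, device_off_time, window_s=300):
    """Weight work-content candidates by the proximity of a device operation to its start/end times."""
    weights = {"setup_at_carry_in": 1.0, "shutdown_at_carry_out": 1.0}
    if abs((button_press_time - device_on_time).total_seconds()) < window_s:
        weights["setup_at_carry_in"] = 2.0       # likely monitor setup work at carry-in
    if abs((button_press_time - device_off_time).total_seconds()) < window_s:
        weights["shutdown_at_carry_out"] = 2.0   # likely shutdown work at carry-out
    return weights
```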
  • As a result, the accuracy of estimating the person's role and work content can be improved.
  • FIG. 16 is a detailed block diagram of a main part of the surgical operation information management system of the fifth embodiment.
  • The fifth embodiment of FIG. 16 differs from the first embodiment of FIG. 2 in that it includes a color identification processing unit 12G that outputs color information based on the imaging data, and a weighting coefficient addition unit 12H that generates and outputs a weighting coefficient based on the surgical procedure information (information on the content of the surgery) from the operating room system 20A constituting the in-hospital system 20 and the color information from the color identification processing unit 12G.
  • The color identification processing unit 12G performs color detection using OpenCV (Open Source Computer Vision Library) or the like, acquires the colors in the image together with the positions where they are detected, and outputs them as color information.
  • Examples of the color data format include RGB, YUV, YPbPr (YCbCr), HSV, and the like.
  • In the fifth embodiment, it suffices for the data processing unit 12A to acquire at least the basic data D0 and the color information data D4 (in the example of FIG. 12, the color A of position A, the color A of position B, and the color C of position C) among the acquired data D0 to D4 shown in FIG. 14.
  • In an operating room, the color of clothing may be determined by the role of the staff member (doctor, nurse, operating room staff, etc.). Therefore, the surgical procedure information from the operating room system 20A and the color information from the color identification processing unit 12G are input to the weighting coefficient addition unit 12H, the weighting of each person's role is changed according to the color of the clothing, and the result is output to the situation determination unit 12B.
  • the situation determination unit 12B can use the color of the person's clothes for estimating the work content, and can increase the probability that the estimation in estimating the role of the person and the work content is correct.
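  • A minimal OpenCV-based sketch of such clothing-color weighting is shown below; the HSV ranges, the role mapping, and the weighting formula are illustrative assumptions (in practice they would follow the hospital's actual uniform colors and the surgical procedure information).

```python
import cv2
import numpy as np

# Illustrative clothing-color rules (hypothetical HSV ranges and role mapping).
COLOR_ROLE_RULES = [
    {"role": "surgeon",              "lower": np.array([100, 60, 60], np.uint8), "upper": np.array([130, 255, 255], np.uint8)},
    {"role": "operating_room_staff", "lower": np.array([140, 60, 60], np.uint8), "upper": np.array([170, 255, 255], np.uint8)},
]

def clothing_role_weights(frame_bgr, person_bbox):
    """Estimate role weights from the dominant clothing color inside a person's bounding box."""
    x0, y0, x1, y1 = person_bbox
    hsv = cv2.cvtColor(frame_bgr[y0:y1, x0:x1], cv2.COLOR_BGR2HSV)

    weights = {}
    for rule in COLOR_ROLE_RULES:
        mask = cv2.inRange(hsv, rule["lower"], rule["upper"])
        coverage = float(np.count_nonzero(mask)) / mask.size
        weights[rule["role"]] = 1.0 + coverage       # more matching pixels -> higher role weight
    return weights
```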
  • As a result, the accuracy of estimating the person's role and work content can be improved.
  • the management server 14 has been described as one device, but it can also be constructed as a cloud server composed of a plurality of devices.
  • In the embodiments described above, a camera is used as the sensor of the data collection device for determining the role of a person (for example, an operator), but any sensor whose output allows the role to be determined can also be used.
  • the present technology can also have the following configurations.
  • (1) A surgical operation information management system including: a data processing unit that acquires information about a person or an object included in a captured image of an area related to surgery; and a situation determination unit that estimates the role of the person in the surgery based on the information about the person or the object.
  • (2) The data processing unit acquires, as the information about the person or the object, at least one of the position information of the person or the object, the discrimination result of the person or the object, and the posture information of the person.
  • the surgical operation information management system described in (1).
  • a judgment database that stores judgment data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases is provided.
  • the situation determination unit estimates the role of the person in the surgery by comparing the information about the person or object with the determination data.
  • the surgical operation information management system according to (1) or (2).
  • the situation determination unit estimates the role based on at least one of the positional relationship between the patient and the person who is the target of the operation and the positional relationship between the person and the object.
  • the surgical operation information management system according to any one of (1) to (3).
  • the situation determination unit estimates the role for each surgical procedure based on the position of the person or the object with respect to the patient who is the target of the surgery.
  • the surgical operation information management system according to any one of (1) to (4).
  • the situation determination unit estimates the role for each surgical procedure based on a vector from the patient who is the target of the surgery to the person or the object.
  • The surgical operation information management system according to any one of (1) to (4).
  • (7) The situation determination unit estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
  • the surgical operation information management system according to any one of (1) to (4).
  • the situation determination unit estimates the role for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
  • the surgical operation information management system according to any one of (1) to (4).
  • the situation determination unit estimates the role based on the operating status of the equipment used for the surgery for each surgical procedure.
  • the surgical operation information management system according to any one of (1) to (4).
  • a data collecting device for acquiring the captured image by imaging the region related to the surgery is provided.
  • the surgical operation information management system according to any one of (1) to (9).
  • the data collection device includes a plurality of cameras that output the captured image.
  • (12) A presentation unit that presents information about the role.
  • the person is a surgeon,
  • (14) A surgical operation information management method including: a process of acquiring information about a person or an object contained in a captured image of an area related to surgery; and a process of estimating the role of the person in the surgery based on the information about the person or the object.
  • the surgical operation information management method refers to a judgment database that stores judgment data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases.
  • the role of the person in the surgery is estimated by comparing the information about the person or object with the determination data.
  • the process of estimating the role estimates the role based on at least one of the positional relationship between the patient and the person to be operated on and the positional relationship between the person and the object.
  • the surgical operation information management method according to any one of (14) to (16).
  • the role is estimated based on the position of the person or the object with respect to the patient who is the target of the operation for each surgical procedure.
  • the surgical operation information management method according to any one of (14) to (17).
  • the role is estimated based on the vector from the patient who is the target of the operation to the person or the object for each surgical procedure.
  • the surgical operation information management method according to any one of (14) to (17).
  • the process of estimating the role estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
  • (21) In the process of estimating the role, the role is estimated for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
  • (22) In the process of estimating the role, the role is estimated based on the operating status of the equipment used for the surgery for each surgical procedure.
  • (23) With the process of presenting information about the role The surgical operation information management method according to any one of (14) to (22).
  • the person is a surgeon, The surgical operation information management method according to any one of (14) to (23).
  • (25) A program for controlling a surgical operation information management system with a computer, the program causing the computer to function as: a means for acquiring information about a person or an object included in a captured image of an area related to surgery; and a means for estimating the role of the person in the surgery based on the information about the person or the object.
  • the means for acquiring the information acquires at least one of the position information of the person or the object, the discrimination result of the person or the object, and the posture information of the person as the information about the person or the object.
  • the surgical operation information management system includes a judgment database that stores judgment data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases.
  • the means for estimating the role estimates the role of the person in the surgery by comparing the information about the person or object with the determination data.
  • the program according to (25) or (26).
  • the means for estimating the role estimates the role based on at least one of the positional relationship between the patient and the person to be operated on and the positional relationship between the person and the object.
  • the program according to any one of (25) to (27).
  • the means for estimating the role estimates the role based on the position of the person or the object with respect to the patient who is the target of the operation for each surgical procedure.
  • the means for estimating the role estimates the role for each surgical procedure based on a vector from the patient who is the target of the surgery to the person or the object.
  • The program according to any one of (25) to (28).
  • (31) The means for estimating the role estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
  • the means for estimating the role is to estimate the role for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
  • the means for estimating the role estimates the role based on the operating status of the equipment used for the surgery for each surgical procedure.
  • Surgical operation information management system 11 Data collection device 11A Camera 12 Data processing device 12A Data processing unit 12B Situation judgment unit 12C Patient estimation processing unit 12D-12F Weighting coefficient addition unit 12G Color identification processing unit 12H Weighting coefficient addition unit 13 Judgment database 14 Management server 15 Information presentation system 15A Information presentation processing unit 15B Display 20 In-hospital system 20A Operating room system 21 Case-specific data 31 Role data 32 Work content data 33 Time data 41 Operating room equipment D0 Basic data D1 Operational data D2 Surgeon data D3 Equipment information data D4 Color information data DA1 Distance data DA2 Distance data DA3 Vector data DR Judgment data

Abstract

This system for surgery operating management of an embodiment comprises: a data processing unit for obtaining information on a person or an object included in a captured image of a region related to a surgery; and a situation determination unit for, on the basis of the information on the person or the object, estimating a role of the person in the surgery.

Description

手術運用情報管理システム、手術運用情報管理方法及びプログラム (Surgical operation information management system, surgical operation information management method and program)
 本開示は、手術運用情報管理システム、手術運用情報管理方法及びプログラムに関する。 This disclosure relates to a surgical operation information management system, a surgical operation information management method and a program.
 最適な手術計画を行うためには、手術に関係する各種装置の手配や、各スタッフの作業内容の把握が必要となる。
 またさらなる運用の改善のための情報提供がなされることが望まれる。
In order to make an optimal surgery plan, it is necessary to arrange various devices related to surgery and to understand the work contents of each staff member.
It is also hoped that information will be provided to further improve operations.
特開2018-196627号公報JP-A-2018-196627
 しかしながら、特許文献1記載の技術は、医療従事者が患者の担当であるか否かを判定できるに過ぎず、手術室内での医療従事者の役割や、作業内容等を判別することはできなかった。
 また、手術室内のスタッフは、ほぼ同一の着衣であり、一般的な技術では個人(人物)の識別ができない。さらに現状では、作業記録はとられていないため、できる限り作業処理負担を低減しつつ、各スタッフの役割毎の作業内容、作業時間、作業回数等の作業情報の収集を行い、手術室の運用の改善に必要な情報の提供を行うことによる手術関連業務の効率化が望まれている。
However, the technique described in Patent Document 1 can only determine whether or not a medical staff member is in charge of a patient; it cannot determine the staff member's role in the operating room, the content of their work, or the like.
In addition, staff in the operating room wear nearly identical clothing, so an individual (person) cannot be identified with general-purpose technology. Furthermore, at present no work records are kept. It is therefore desirable to improve the efficiency of surgery-related work by collecting work information such as work content, working time, and number of tasks for each staff member's role, while keeping the added processing burden as low as possible, and by providing the information needed to improve operating room operation.
　本技術は、このような状況に鑑みてなされたものであり、手術関連業務の効率化を図ることが可能な手術運用情報管理システム、手術運用情報管理方法及びプログラムを提供することを目的としている。 The present technology was made in view of this situation, and aims to provide a surgical operation information management system, a surgical operation information management method, and a program capable of improving the efficiency of surgery-related work.
　実施形態の手術運用情報管理システムは、手術室内あるいは手術室関連領域における撮像画像に含まれる人物に関連する情報あるいは前記撮像画像に含まれる物に関連する情報を取得するデータ処理部と、前記人物に関連する情報及び前記物に関連する情報に基づいて、手術に関わる人物の前記手術における役割を推定する状況判定部と、を備える。 The surgical operation information management system of the embodiment includes a data processing unit that acquires information related to a person or to an object included in an image captured in an operating room or an operating-room-related area, and a situation determination unit that estimates the role in the surgery of a person involved in the surgery based on the information related to the person and to the object.
図1は、実施形態の手術運用情報管理システムの概要構成ブロック図である。FIG. 1 is a schematic block diagram of the surgical operation information management system of the embodiment. 図2は、第1実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。FIG. 2 is a detailed block diagram of a main part of the surgical operation information management system of the first embodiment. 図3は、判定用データベースの一例の説明図である。FIG. 3 is an explanatory diagram of an example of the determination database. 図4は、判定用データベースの構築処理フローチャートである。FIG. 4 is a flowchart of the determination database construction process. 図5は、人物の役割(ロール)を推定する処理の概要処理フローチャートである。FIG. 5 is an outline processing flowchart of the process of estimating the role of a person. 図6は、人物の役割(ロール)を推定する処理の要部詳細処理フローチャートである。FIG. 6 is a detailed processing flowchart of a main part of the processing for estimating the role of a person. 図7は、役割推定の具体例の説明図である。FIG. 7 is an explanatory diagram of a specific example of role estimation. 図8、作業内容の可視化例の説明図(術前、術中)である。FIG. 8 is an explanatory diagram (preoperative and intraoperative) of a visualization example of work contents. 図9は、作業内容の可視化例の説明図(術後)である。FIG. 9 is an explanatory diagram (postoperative) of a visualization example of the work content. 図10は、第2実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。FIG. 10 is a detailed block diagram of a main part of the surgical operation information management system of the second embodiment. 図11は、第2実施形態において、追加されるデータの一例の説明図である。FIG. 11 is an explanatory diagram of an example of data to be added in the second embodiment. 図12は、第2実施形態の判定用データベースの構成例の説明図である。FIG. 12 is an explanatory diagram of a configuration example of the determination database of the second embodiment. 図13は、第3実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。FIG. 13 is a detailed block diagram of a main part of the surgical operation information management system of the third embodiment. 図14は、第3実施形態~第5実施形態のデータ処理部の取得データの一例の説明図である。FIG. 14 is an explanatory diagram of an example of acquired data of the data processing unit of the third to fifth embodiments. 図15は、第4実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。FIG. 15 is a detailed block diagram of a main part of the surgical operation information management system of the fourth embodiment. 図16は、第5実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。FIG. 16 is a detailed block diagram of a main part of the surgical operation information management system according to the fifth embodiment.
 以下、図面を参照して、実施形態について詳細に説明する。
 図1は、実施形態の手術運用情報管理システムの概要構成ブロック図である。
 手術運用情報管理システム10は、術前、術中及び術後にわたって手術室内の撮像を行い撮像データを出力する一又は複数のカメラを備えたデータ収集装置11と、撮像データに基づいてデータ処理を行い、手術運用状況を判定するデータ処理装置12と、手術運用状況の判定に用いる判定用データを記憶した判定用データベース(DB)13を有する管理サーバ14と、判定された手術運用状況を可視化して提示する情報提示システム15と、を備えている。
Hereinafter, embodiments will be described in detail with reference to the drawings.
FIG. 1 is a schematic block diagram of the surgical operation information management system of the embodiment.
The surgical operation information management system 10 includes a data collection device 11 equipped with one or more cameras that capture images in the operating room before, during, and after surgery and output imaging data; a management server 14 having a data processing device 12 that performs data processing based on the imaging data and determines the surgical operation status, and a determination database (DB) 13 that stores determination data used for that determination; and an information presentation system 15 that visualizes and presents the determined surgical operation status.
 さらに手術運用情報管理システム10は、必要に応じて連携して動作可能な院内システム20に通信ネットワークを介して接続されている。 Further, the surgical operation information management system 10 is connected to the in-hospital system 20 which can operate in cooperation with each other as needed via a communication network.
　上記構成において、データ処理装置12は、カメラから入力された撮像データに対して処理を行って、各種装置、手術器具の搬入セッティング行う作業者、実際に手術に関わる医者(執刀医、助手、サポート医)、看護師等の人物の役割を判定する。 In the above configuration, the data processing device 12 processes the imaging data input from the cameras and determines the role of each person, such as a worker who carries in and sets up the various devices and surgical instruments, a doctor actually involved in the surgery (surgeon, assistant, or support doctor), or a nurse.
[1]第1実施形態
 図2は、第1実施形態の手術運用情報管理システムの要部詳細構成ブロック図である。
 データ収集装置11は、複数のカメラ11Aを備えている。
 データ収集装置11において用いられるカメラ11Aの種類としては、CCD/CMOSセンサを用いたビデオ機能を有するカメラ、インターバル撮影鏡を有するカメラ、有線あるいは無線により動画あるいは静止画の撮影が可能なカメラ等が挙げられる。
 またTOF(Time Of Flight)等による測距のために、IR(近赤外)センサを併せ持つカメラを用いることも可能である。
[1] First Embodiment FIG. 2 is a detailed block diagram of a main part of the surgical operation information management system of the first embodiment.
The data collection device 11 includes a plurality of cameras 11A.
Types of camera 11A used in the data collection device 11 include a camera with a video function using a CCD/CMOS sensor, a camera with an interval shooting function, and a camera capable of capturing moving or still images over a wired or wireless connection.
It is also possible to use a camera having an IR (near infrared) sensor for distance measurement by TOF (Time Of Flight) or the like.
　この場合において、カメラの手術室内の配置位置としては、手術室内の手術台を主として手術の様子の全景を撮影可能な位置とされる。手術室の大きさや間取り、手術台の配置、使用するカメラの撮像範囲にもよるが、手術及び各スタッフの動線の邪魔にならない位置に設置される。 In this case, the cameras are placed in the operating room at positions from which a panoramic view of the surgery, centered mainly on the operating table, can be captured. Although it depends on the size and floor plan of the operating room, the layout of the operating table, and the imaging range of the cameras used, they are installed where they do not interfere with the surgery or with the flow lines of the staff.
 具体的には、以下の位置が挙げられる。
 (1)天井部に固定設置
 (2)手術室の隅
 (3)無影灯に取り付け、あるいは、無影灯に内蔵
 (4)手術用モニタに取り付け、あるいは、手術用モニタに内蔵
Specifically, the following positions can be mentioned.
(1) Fixed installation on the ceiling (2) Corner of the operating room (3) Attached to the surgical light or built into the surgical light (4) Attached to the surgical monitor or built into the surgical monitor
 また、術場カメラ、術野カメラ、内視鏡カメラ、顕微鏡カメラ等の既存のカメラを用いるようにすることも可能である。 It is also possible to use existing cameras such as a surgical field camera, a surgical field camera, an endoscopic camera, and a microscope camera.
 さらに術前、術後のモニタリングを行い、撮像データを取得するために、手術室外に設けたカメラを用いるようにすることも可能である。
 このような手術室外のカメラの設置場所として、手術部フロアにおける手術室外の廊下、機材室等としてもよい。
 すなわち、手術室内及び手術室外の手術に関連する領域のうち、少なくとも手術室内を撮像できる場所であれば、任意に設置が可能である。
Furthermore, it is also possible to use a camera provided outside the operating room to perform preoperative and postoperative monitoring and acquire imaging data.
Such a camera outside the operating room may be installed in a corridor outside the operating room on the floor of the operating unit, an equipment room, or the like.
That is, it can be arbitrarily installed as long as it is possible to image at least the operating room in the area related to the operation in the operating room and outside the operating room.
 ところで、複数のカメラを設置した場合には、複数のカメラ同士の同期を図る必要がある。
 同期を図る方法としては、以下のような方法が挙げられる。
 (1)有線又は無線により複数のカメラに接続された制御機器による一括カメラ制御
 (2)有線又は無線により複数のカメラに接続された制御機器からの同期信号に基づくカメラ同期動作
 (3)有線又は無線により複数のカメラに接続された制御機器による各撮像データへのタイムコード設定、あるいは、カメラから入力された撮像データへのタイムコードの記録
By the way, when a plurality of cameras are installed, it is necessary to synchronize the plurality of cameras with each other.
The following methods can be mentioned as a method for synchronizing.
(1) Collective camera control by control devices connected to multiple cameras by wire or wireless (2) Camera synchronization operation based on synchronization signals from control devices connected to multiple cameras by wire or wireless (3) Wired or wireless Time code setting for each imaging data by a control device connected to multiple cameras wirelessly, or recording of the time code for imaging data input from the cameras
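As a simple illustration of synchronization method (3), the sketch below groups frames from multiple cameras whose recorded time codes fall within a small tolerance; the data layout and the 20 ms tolerance are illustrative assumptions, not part of this publication.

```python
def align_frames_by_timecode(camera_streams, tolerance_s=0.02):
    """Group frames from multiple cameras whose time codes agree within a tolerance.

    camera_streams is assumed to be {camera_id: [(timecode_seconds, frame), ...]}, sorted by time.
    """
    reference_id = next(iter(camera_streams))
    aligned = []
    for t_ref, frame_ref in camera_streams[reference_id]:
        group = {reference_id: frame_ref}
        for cam_id, frames in camera_streams.items():
            if cam_id == reference_id:
                continue
            # pick the frame whose time code is closest to the reference frame's time code
            t_best, f_best = min(frames, key=lambda tf: abs(tf[0] - t_ref))
            if abs(t_best - t_ref) <= tolerance_s:
                group[cam_id] = f_best
        aligned.append((t_ref, group))
    return aligned
```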
　カメラから取得する撮像データとしては、JPEG等の圧縮画像フォーマット、BMP等の非圧縮画像フォーマットによる静止画データ、MPEG等の動画データ、動画データから切り出した静止画データ等が用いられる。 As the imaging data acquired from the cameras, still image data in compressed image formats such as JPEG or in uncompressed image formats such as BMP, moving image data such as MPEG, still images cut out from moving image data, and the like are used.
 データ処理装置12は、図2に示すように、データ処理部12Aと、状況判定部12Bと、を備えている。
 データ処理部12Aは、データ収集装置11が取得した撮像データに対応する撮像画像から物体(人物及び物)を判別し、人物の姿勢、人物及び物の位置、時刻を基本データとして取得する。
As shown in FIG. 2, the data processing device 12 includes a data processing unit 12A and a status determination unit 12B.
The data processing unit 12A discriminates an object (person and object) from the captured image corresponding to the captured data acquired by the data collection device 11, and acquires the posture of the person, the position of the person and the object, and the time as basic data.
 この場合において、人物の姿勢、人物及び物の位置の検出については、YOLO等のディープラーニングによる一般物体認識処理、OpenPose等の姿勢・骨格検知アルゴリズム等を用いて行われる。
 また時刻については、撮像データのタイムコードから取得することが可能である。
In this case, the posture of the person and the position of the person and the object are detected by using a general object recognition process by deep learning such as YOLO, a posture / skeleton detection algorithm such as OpenPose, and the like.
The time can be obtained from the time code of the imaging data.
 またデータ処理部12Aにおいて、判別する人物以外の物としては、手術室内に配置されている、あるいは、手術室内に持ちまれる医療機器、医療器械、医療材料等が挙げられる。 In the data processing unit 12A, examples of objects other than the person to be discriminated include medical devices, medical instruments, medical materials, etc. that are placed in the operating room or carried in the operating room.
　医療機器としては、モニタ、トロリー、内視鏡光源、内視鏡カメラコントロールユニット(内視鏡CCU)、内視鏡カメラヘッド(内視鏡CH)、顕微鏡、超音波診断機、血圧計、パルスオキシメータ、心電図計、温度計(直腸温度計等)、導尿カテーテル、換気マスク等が挙げられる。 Examples of medical devices include monitors, trolleys, endoscope light sources, endoscope camera control units (endoscope CCUs), endoscope camera heads (endoscope CHs), microscopes, ultrasonic diagnostic machines, sphygmomanometers, pulse oximeters, electrocardiographs, thermometers (rectal thermometers, etc.), urinary catheters, ventilation masks, and the like.
　また器械類としては、鉗子、把持鉗子、持針器、鑷子、剪刀、メス、電気メス(バイポーラ、モノポーラ)、レーザメス、超音波凝固切開装置、ステープラ、トロッカ、咽頭鏡等が挙げられる。 Examples of instruments include forceps, grasping forceps, needle holders, tweezers, scissors, scalpels, electrosurgical knives (bipolar, monopolar), laser scalpels, ultrasonic coagulation and incision devices, staplers, trocars, laryngoscopes, and the like.
 また、医療材料としては、ガーゼ、針、組織接着用シール、生理食塩水、各種輸液、輸血用血液、酸素ボンベ、笑気ガスボンベ等が挙げられる。 Examples of medical materials include gauze, needles, tissue adhesive seals, physiological saline, various infusions, blood for transfusion, oxygen cylinders, laughing gas cylinders, and the like.
 また、データ処理部12Aにおいて、検出される位置のデータとしては、物体の位置を手術室の床、壁、あるいは天井等に投影した2次元位置(例えば、(x、y))、あるいは、物体の手術室内の3次元位置(例えば、(x、y、z))が含まれる。 Further, as the data of the position detected by the data processing unit 12A, the two-dimensional position (for example, (x, y)) obtained by projecting the position of the object onto the floor, wall, ceiling, or the like of the operating room, or the object. 3D positions in the operating room (eg, (x, y, z)) are included.
 この場合において、3次元位置データの取得方法としては、複数台のカメラによりそれぞれ撮影された撮像画像についてYOLO等のディープラーニングによる一般物体認識処理により検出された物体の2次元位置の情報及び各カメラの3次元位置情報、画角等に基づいて、撮像領域に仮想的に設けた基準点の位置(水平、垂直方向及び奥行き方向に複数設けた位置)との関係から計算する方法が挙げられる。 In this case, as a method of acquiring the three-dimensional position data, information on the two-dimensional position of the object detected by the general object recognition process by deep learning such as YOLO and each camera for the captured images taken by a plurality of cameras respectively. Based on the three-dimensional position information, the angle of view, and the like, a method of calculating from the relationship with the positions of reference points virtually provided in the imaging region (positions provided in the horizontal, vertical, and depth directions) can be mentioned.
 またTOFあるいはLiDAR(Light Detection And Ranging)等の直接的な測距技術を用いるようにしても良い。 Alternatively, a direct distance measuring technique such as TOF or LiDAR (Light Detection And Ringing) may be used.
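As a rough illustration of the multi-camera approach, the sketch below triangulates a 3D point in the operating-room coordinate system from the 2D detections of two calibrated cameras. The projection matrices `P1` and `P2` are assumptions standing in for the camera positions, angles of view, and reference points mentioned above.

```python
import numpy as np
import cv2

def triangulate_room_position(p1_2d, p2_2d, P1, P2):
    """Estimate the 3D position (x, y, z) of an object in the operating-room
    coordinate system from its 2D image positions in two calibrated cameras.

    p1_2d, p2_2d : (x, y) pixel coordinates of the same object in camera 1 / 2
    P1, P2       : 3x4 projection matrices of camera 1 / 2 (assumed known from
                   camera placement, angle of view, and reference points)
    """
    pts1 = np.array(p1_2d, dtype=np.float64).reshape(2, 1)
    pts2 = np.array(p2_2d, dtype=np.float64).reshape(2, 1)
    hom = cv2.triangulatePoints(P1, P2, pts1, pts2)   # 4x1 homogeneous point
    xyz = (hom[:3] / hom[3]).ravel()                  # convert to Euclidean
    return tuple(xyz)

# Usage example (the projection matrices here are placeholders, not calibrated values):
# P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
# P2 = np.array([[1, 0, 0, -0.5], [0, 1, 0, 0], [0, 0, 1, 0]], dtype=float)
# print(triangulate_room_position((640, 360), (600, 362), P1, P2))
```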
Here, the configuration of the determination database 13 provided in the management server 14 will be described.
The determination database 13 stores, for each case, the data acquired in past cases.
FIG. 3 is an explanatory diagram of an example of the determination database.
The determination database 13 includes a plurality of pieces of case-specific data 21-1 to 21-n.
Since the case-specific data 21-1 to 21-n have the same configuration, the case-specific data 21-1 will be described below as an example.
A case ID (xxx-yyyy in the example of FIG. 3) for identifying the case is assigned to the case-specific data 21-1, which includes role data 31, work content data 32, and time data 33.
The role data 31 is data representing the role of a person in the surgery of the case.
Here, a role is an attribute indicating the position (expected work content or duty) assigned to the target person in the surgery.
Examples of the role data 31 include the following roles:
(1) Surgeon
(2) Assistant
(3) Direct assistant (instrument delivery)
(4) Indirect assistant (circulating)
(5) Anesthesiologist
(6) Peri-anesthesia nurse
(7) Clinical engineer
(8) Operating room staff (carry-in and carry-out)
(9) Contractor (cleaning, disinfection, assistance, etc.)
The work content data 32 is data representing the work content of each person.
Examples of the work content data 32 include the following:
(1) General work contents
 (1-1) Equipment carry-in
 (1-2) Equipment carry-out
 (1-3) Equipment setup
 (1-4) Patient acceptance
 (1-5) Induction of anesthesia
  (1-5-1) Securing an intravenous route
  (1-5-2) Sensor attachment (electrocardiograph, sphygmomanometer, pulse oximeter, etc.)
  (1-5-3) Tracheal intubation
 (1-6) Surgery preparation
  (1-6-1) Urinary catheterization
  (1-6-2) Thermometer attachment
  (1-6-3) Body position fixation
  (1-6-4) Heat retention
  (1-6-5) Instrument deployment
  (1-6-6) Gauze count
 (1-7) Recording work
 (1-8) Telephone call
 (1-9) Patient condition confirmation (body temperature, blood pressure, urine, compression)
 (1-10) Instrument delivery
 (1-11) Direct assistance support (handing over instruments and medical materials to and from the direct assistant)
 (1-12) Emergence from anesthesia
  (1-12-1) Release of body position fixation
  (1-12-2) Sensor removal
  (1-12-3) Confirmation of arousal
  (1-12-4) Removal of the tracheal intubation tube
 (1-13) Patient leaving the room
  (1-13-1) Bed transfer
 (1-14) Cleaning
(2) Procedure work contents
 (2-1) Grasping
 (2-2) Hemostasis
 (2-3) Resection
 (2-4) Dissection
 (2-5) Suturing
 (2-6) Ligation
 (2-7) Anastomosis
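To make the shape of the case-specific data concrete, the following is a minimal sketch of how one record combining the role data, work content data, and time data above might be represented. The field names are illustrative assumptions, not a schema taken from the present description.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class WorkRecord:
    """One observed work item of one person within a case
    (cf. role data 31, work content data 32, and time data 33)."""
    role: str                           # e.g. "anesthesiologist", "direct assistant"
    work_content: str                   # e.g. "(1-5-3) tracheal intubation"
    posture: List[Tuple[float, float]]  # joint coordinates of the worker
    position: Tuple[float, float]       # worker position in the room
    equipment: Optional[str] = None     # equipment used, if any (type / product name)
    equipment_position: Optional[Tuple[float, float]] = None
    start_time: Optional[float] = None  # work start (from the time code)
    end_time: Optional[float] = None    # work end

@dataclass
class CaseData:
    """Case-specific data 21-k, identified by a case ID such as 'xxx-yyyy'."""
    case_id: str
    records: List[WorkRecord]
```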
Next, an outline of the situation determination unit 12B will be described.
The situation determination unit 12B compares the data acquired by the data processing unit 12A with the case-specific data 21-1 to 21-n, which are the data of past cases stored in the determination database 13, and thereby estimates the role and work content to which each person in the data acquired by the data processing unit 12A corresponds. In this case, the role and the work content may be estimated based on information on temporal change obtained from the imaging data of preceding and succeeding frames or from a plurality of pieces of imaging data.
The estimation method of the situation determination unit 12B may also be configured to use machine learning (AI).
In addition to the estimated role and work content of each person, the situation determination unit 12B calculates, for each role, the time from the start to the end of a task (for example, the time from the start of suturing to the end of suturing), and calculates relative elapsed times based on key events of the surgery (sign-in, time-out, sign-out, etc.).
The information presentation system 15 includes an information presentation processing unit 15A that performs processing for presenting various kinds of information, and a display 15B that presents the various kinds of information under the control of the information presentation processing unit 15A.
Next, a specific processing example of the first embodiment will be described.
First, the process of constructing the determination database will be described.
FIG. 4 is a flowchart of the determination database construction process.
The database builder first prepares for imaging (step S11).
Specifically, one or more cameras 11A are installed in the target operating room, and the position information of the installed cameras 11A is acquired.
Subsequently, an actual surgery is performed, imaging is performed from the start to the end of the surgery, that is, before, during, and after the surgery, and the imaging data is collected in the management server 14 (step S12).
At the same time, the data processing unit 12A of the data processing device 12 recognizes objects such as persons and things in the captured images corresponding to the imaging data acquired by the data collection device 11 by general object recognition processing, and acquires object recognition information in which the posture of each person and the persons and things are recognized by a skeleton detection algorithm or the like. The data processing unit 12A also acquires the position information of persons and things, and further acquires time information from the time code of the imaging data as basic data (step S13).
Specifically, a skeleton recognition result for each person in the screen is acquired for each frame of the imaging data output by each camera 11A, and the position of the person, the orientation of the body, and the positions of the hands in a three-dimensional coordinate system set in the operating room are calculated from the relationship between the imaging positions of the cameras 11A.
For example, 18 or more joints are detected as a skeleton, the in-screen coordinates of the joints constituting the skeleton are acquired, and the position of the person, the orientation of the body, and the positions of the hands in the three-dimensional coordinate system set in the operating room are calculated from the relationship between the imaging positions of the cameras 11A.
In addition, an object recognition result for the objects (persons and things) in the screen is acquired for each frame of each camera, and the position of each object in the three-dimensional coordinate system set in the operating room is calculated from the relationship between the imaging positions of the cameras 11A.
For example, each object is given a person tag or a thing tag (an object other than a person: for example, a medical instrument type), the object position (the in-screen coordinates of the object) is acquired, and the position of the object in the three-dimensional coordinate system set in the operating room is calculated from the relationship between the imaging positions of the cameras 11A.
Further, the situation determination unit 12B of the data processing device 12 sets the relationship between each person and each object across frames of the imaging data, based on the orientation and position information of persons and things obtained by the above-described processing.
For example, the relationships are set as follows (a simple rule sketch is given below):
(1) When a person and a thing keep a similar positional relationship and move by the same amount: it is set that the person is moving the thing.
(2) When a thing stays at the same position and a person faces the thing: it is set that the person is involved with the thing (for example, operating it).
(3) When a thing is held in a person's hand: it is set that the person is holding the thing.
(4) When a person is lying on an object: it is set that the patient is lying (sleeping) on a bed or an operating table.
(5) When a person faces the patient and the person's hands are at the same position as the patient: it is set that the person is attending to the patient.
Further, based on the position information of persons and things, the area (region) in which each person and thing is located is set.
Specifically, the area around the patient is set as a clean area, the area above a sterilized non-woven drape detected as a thing is set as a clean area, and the other areas are set as unclean areas.
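The following is a minimal sketch of the kind of frame-to-frame rules and area assignment described above, expressed as simple geometric thresholds. The threshold values and helper names are assumptions for illustration only, and orientation checks are omitted for brevity.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def relate_person_and_thing(person_prev, person_now, thing_prev, thing_now,
                            hand_pos, near=0.5, still=0.05, grip=0.1):
    """Classify the person-thing relationship between two frames.

    Positions are (x, y) room coordinates in metres; `near`, `still`, and
    `grip` are illustrative thresholds, not values from the description.
    """
    person_move = dist(person_prev, person_now)
    thing_move = dist(thing_prev, thing_now)
    # Rule (1): same amount of movement while staying close together
    if dist(person_now, thing_now) < near and thing_move > still \
            and abs(person_move - thing_move) < still:
        return "person is moving the thing"
    # Rule (3): the thing is in the person's hand
    if dist(hand_pos, thing_now) < grip:
        return "person is holding the thing"
    # Rule (2): the thing is stationary and the person is close to it
    if thing_move < still and dist(person_now, thing_now) < near:
        return "person is involved with the thing (e.g. operating it)"
    return "no relation"

def classify_area(position, patient_pos, drape_regions, clean_radius=1.0):
    """Clean / unclean area assignment around the patient and sterile drapes."""
    if dist(position, patient_pos) < clean_radius:
        return "clean"
    if any(x1 <= position[0] <= x2 and y1 <= position[1] <= y2
           for (x1, y1, x2, y2) in drape_regions):
        return "clean"
    return "unclean"
```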
Subsequently, the database builder, in cooperation with experts, refers to the acquired object recognition information, position information, and time information, and performs tagging and correction (step S14).
More specifically, based on the captured moving images, the experts tag the work content of each staff member (doctors, nurses, etc.), the instruments used by each staff member, the role of each staff member, and regions such as the clean areas and unclean areas.
For example, when the following conditions (1) to (6) are detected, person A is tagged as "an anesthesiologist performing tracheal intubation":
(1) Position of person A: head side of the patient
(2) Position of the thing: head side of the patient
(3) Area: unclean area
(4) Person's hand: holding the thing
(5) Thing: laryngoscope
(6) Posture of person A: facing the patient, leaning forward
Further, when the information automatically acquired by the data processing unit 12A is erroneous, the erroneous information is corrected.
In this case, position information of persons and things obtained by sensors, audio information obtained by microphones, and the like may be further added as supplementary data, to be used as a reference when the data is referred to later.
Similarly, the processes of steps S11 to S14 described above are performed for a plurality of surgeries, and a data set is constructed for each medical department and procedure to form the determination database 13 (step S15).
Next, a process of estimating the role of a person using the determination database 13 will be described.
FIG. 5 is an outline processing flowchart of the process of estimating the role of a person.
The data processing operator first prepares for imaging (step S21).
Specifically, one or more cameras 11A are installed in the target operating room, and the position information of the installed cameras 11A is acquired.
Alternatively, for an operating room in which the cameras 11A have already been installed, the already acquired position information is retrieved and updated.
Subsequently, imaging is performed from the start to the end of the surgery, that is, before, during, and after the surgery, and the imaging data is collected in the management server 14 (step S22).
At the same time, the data processing unit 12A of the data processing device 12 recognizes objects such as persons and things in the captured images corresponding to the imaging data acquired by the data collection device 11 by general object recognition processing, and acquires object recognition information in which the posture of each person and the persons and things are recognized by a skeleton detection algorithm or the like. The data processing unit 12A also acquires the position information of persons and things, and further acquires time information from the time code of the imaging data as basic data (step S23).
FIG. 6 is a detailed processing flowchart of a main part of the process of estimating the role of a person.
Specifically, the data processing unit 12A performs skeleton recognition for each person in the screen for each frame of the imaging data output by each camera 11A (step S31).
Subsequently, the data processing unit 12A extracts joints from the skeleton recognition result (step S32).
Next, the data processing unit 12A extracts the joint positions and calculates the position of the person, the orientation of the body, and the positions of the hands in the three-dimensional coordinate system set in the operating room from the relationship between the imaging positions of the cameras 11A (step S33).
For example, the data processing unit 12A extracts 18 or more joints from the skeleton detection result, acquires the in-screen coordinates of the joints constituting the skeleton, and calculates the position of the person, the orientation of the body, and the positions of the hands in the three-dimensional coordinate system set in the operating room from the relationship between the imaging positions of the cameras 11A (step S34).
In parallel with the processing of steps S31 to S34, the data processing unit 12A recognizes objects other than persons in the screen for each frame of each camera (step S35).
For example, each object is given a person tag or a thing tag (an object other than a person: for example, a medical instrument type), the object position (the in-screen coordinates of the object) is acquired, and the position of the thing in the three-dimensional coordinate system set in the operating room is calculated from the relationship between the imaging positions of the cameras 11A (step S36).
Then, the situation determination unit 12B refers to the determination database 13 and tags each thing (sets the type of thing: for example, the type of medical instrument) (step S37).
Further, the situation determination unit 12B of the data processing device 12 calculates the positions of persons and things in the operating room in the three-dimensional coordinate system, based on the orientation and position information of persons and things obtained by the above-described processing across frames of the imaging data (step S38).
Next, the situation determination unit 12B of the data processing device 12 sets the positional relationship between each person and the corresponding thing (step S39).
Further, the situation determination unit 12B sets the area (clean area or unclean area) in which each person and thing is present (step S40).
Subsequently, as shown in FIG. 5, the situation determination unit 12B of the data processing device 12 compares the object recognition information, position information, and time information acquired in the process of step S23, together with the moving images corresponding to the imaging data, with the data sets stored in the determination database 13, and performs matching by AI (step S24).
Then, for each person included in the object recognition information, the situation determination unit 12B tags the role with the highest matching rate as the role of that person (surgeon, direct assistant, anesthesiologist, etc.) (step S25).
Here, the estimation of the role will be described in detail.
FIG. 7 is an explanatory diagram of a specific example of role estimation.
As shown on the left side of FIG. 7, the data processing unit 12A of the data processing device 12 acquires, as basic data D0 for each frame, the posture and position of each person, the type and position of equipment as a thing, and the imaging time.
For example, in the case of frame data No. 1 in FIG. 7, the posture of person A (joint 1 x, y, ...) and the position of person A (xx, yy) are stored, and the type A, product name B, and position (xw, yy) of equipment M are stored.
On the other hand, the determination database 13 stores, as determination data DR for past cases, the role, work content, posture, position, equipment used, equipment position, and the like of each person.
For example, as shown on the right side of FIG. 7, two pieces of data are stored for role X. In one of them, the work content is F, the posture of the worker (joint 1 x, y, ...) and the position of the worker (ww, yy) are stored, and the type C, product name D, and position (xz, yy) of the equipment used are stored. In the other, the work content is G, the posture of the worker (joint 1 x, y, ...) and the position of the worker (xx, yy) are stored, and the type C, product name D, and position (xw, yy) of the equipment used are stored.
As a result, the situation determination unit 12B compares the data acquired by the data processing unit 12A with the data stored in the determination database 13, estimates the work content and role of the person corresponding to the data acquired by the data processing unit 12A, and calculates the time required for each task (working time) and the relative time with respect to each event.
More specifically, when the posture of person A has a high matching rate with the posture of role X in work G, the position of person A has a high matching rate with the position of role X in work G, equipment A has a high matching rate with the equipment used in work G of role X, and the distance between person A and equipment A has a high matching rate with the distance between the worker and the equipment used in work G of role X, the situation determination unit 12B estimates that person A is role X performing work G.
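A minimal sketch of this matching step is shown below. The similarity measures and weights are illustrative assumptions; each past-case entry is expected to expose the attributes of the hypothetical WorkRecord sketched earlier (role, work content, position, equipment, equipment position).

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def match_role(observation, past_records):
    """Score an observed person against past-case records and return the
    (role, work_content) pair with the highest matching score.

    `observation` is a dict with 'position', 'equipment', and
    'equipment_position' keys; `past_records` is an iterable of
    WorkRecord-like objects from the determination database.
    """
    best, best_score = None, 0.0
    for rec in past_records:
        # Positional similarity between the observed person and the past worker
        score = 1.0 / (1.0 + _dist(observation["position"], rec.position))
        # Same kind of equipment in use
        if rec.equipment is not None and observation.get("equipment") == rec.equipment:
            score += 1.0
        # Similar person-to-equipment distance
        if rec.equipment_position is not None and observation.get("equipment_position"):
            d_obs = _dist(observation["position"], observation["equipment_position"])
            d_ref = _dist(rec.position, rec.equipment_position)
            score += 1.0 / (1.0 + abs(d_obs - d_ref))
        if score > best_score:
            best, best_score = (rec.role, rec.work_content), score
    return best, best_score
```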
Based on the information obtained by the role estimation process, the working time, the workers (number of workers), and the work required inside and outside the operating room for the target surgery (case) can easily be grasped.
FIG. 8 is an explanatory diagram (preoperative and intraoperative) of an example of visualization of work contents.
FIG. 9 is an explanatory diagram (postoperative) of an example of visualization of work contents.
As shown in FIGS. 8 and 9, for a given case, a series of tasks such as equipment carry-in, equipment setup, induction of anesthesia, sign-in (safety confirmation when entering the operating room), surgery preparation, time-out (safety confirmation immediately before surgery), the procedure, indirect assistance work, wound closure, carry-out, emergence from anesthesia, sign-out (safety confirmation at the end of surgery), patient leaving the room, cleaning, and the like can be automatically presented in chronological order in association with the role of each person. Time t1 is the time recognized as the transition from preoperative to intraoperative, and time t2 is the time recognized as the transition from intraoperative to postoperative.
Therefore, since the administrator can easily grasp the work contents and the progress of the work, confirmation of the schedule, determination of whether support is necessary, and the like can be performed easily, which in turn contributes to improving the efficiency of surgery-related work.
Therefore, according to the first embodiment, since the real-time progress, that is, the working time related to the patient, can be grasped, the influence on the schedule of the surgery or the operating room and whether support is necessary can be predicted.
It also becomes possible to predict the schedule of similar procedures.
In addition, the differences between team formations in similar cases can be easily grasped, and a more optimal work schedule can be constructed.
Furthermore, the work sequence can be optimized, and the personnel required for each task can be easily grasped.
Specifically, by comparing with past data (data of similar past cases), the following examinations, predictions, and proposals (1) to (7) can be made to improve efficiency:
(1) Examination of the workflow
(2) Examination of the optimal personnel allocation for each task
(3) Examination of resource arrangement
(4) Examination of team formation
(5) Schedule prediction of the entire surgery including the procedure
(6) Generation of alerts according to the progress of the surgery: personnel support, schedule adjustment, and presentation of the next workflow
(7) Advance prediction and proposal of procedures according to the case
[2] Second Embodiment
FIG. 10 is a detailed configuration block diagram of a main part of the surgical operation information management system of the second embodiment.
The second embodiment of FIG. 10 differs from the first embodiment of FIG. 2 in that it includes a patient estimation processing unit 12C that estimates the position and posture of the patient, and a weighting coefficient addition unit 12D that generates a weighting coefficient based on the estimation result (position and posture) of the patient estimation processing unit 12C and the surgical procedure information (information on the surgery content) acquired from the in-hospital system, adds the weighting coefficient to the estimation result of the patient estimation processing unit 12C, and outputs the result.
For each surgical procedure, the vector (direction, distance) from a body part such as the patient's head to the person or thing whose work content is to be estimated is almost fixed.
For example, if a person holding a laryngoscope is within a few tens of centimeters of the patient's head, that person is highly likely to be an anesthesiologist performing tracheal intubation.
Therefore, for a person holding a laryngoscope within a few tens of centimeters of the patient's head, the weighting coefficient of the anesthesiologist is increased for the role of the person, and the weighting coefficient of tracheal intubation is increased for the work content.
As a result, the probability that the estimated role and work content of the person are correct can be increased.
That is, by inputting the position and posture of the patient from the patient estimation processing unit 12C and the surgical procedure information acquired from the in-hospital system 20 into the weighting coefficient addition unit 12D, and inputting the estimation result of the patient estimation processing unit 12C to which the weighting coefficient has been added into the situation determination unit 12B, the vector from each body part of the patient to a person or thing can be used for estimating the role and work content of the person.
Here, the data that the data processing unit 12A of the second embodiment acquires in addition to the basic data D0 shown in FIG. 7 will be described.
FIG. 11 is an explanatory diagram of an example of the data added in the second embodiment.
The data acquired in addition to the basic data D0 in the second embodiment includes, for example, distance data DA1 between the patient and a person, distance data DA2 between the patient and equipment, and vector data DA3 from the patient to a person.
More specifically, in the example of FIG. 11, the distance data DA1 is the distance between the patient and person A, and xxx cm is stored. The distance data DA2 is the distance between the patient and equipment C, and yyy cm is stored.
Further, the vector data DA3 is a vector from the patient to person C, with a direction of x° and a distance of xx cm.
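A minimal sketch of how such patient-relative vectors might bias the role scores is shown below. The role label, weight values, and distance threshold are assumptions made for illustration, not values from the present description.

```python
import math

def patient_relative_vector(patient_head_xy, person_xy):
    """Direction (degrees) and distance from the patient's head to a person."""
    dx = person_xy[0] - patient_head_xy[0]
    dy = person_xy[1] - patient_head_xy[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)

def apply_patient_weighting(role_scores, held_thing, distance_to_head_cm,
                            boost=1.5, near_cm=50):
    """Boost the role/work hypothesis that is consistent with the
    patient-relative configuration: a laryngoscope held close to the
    patient's head suggests an anesthesiologist performing tracheal
    intubation (`boost` and `near_cm` are illustrative values)."""
    weighted = dict(role_scores)
    key = "anesthesiologist / tracheal intubation"
    if held_thing == "laryngoscope" and distance_to_head_cm < near_cm:
        weighted[key] = weighted.get(key, 1.0) * boost
    return weighted
```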
FIG. 12 is an explanatory diagram of a configuration example of the determination database of the second embodiment.
In this case, the determination database 13 stores surgical procedure data DR-A1, surgeon data DR-A2, and workflow data DR-A3 as determination data DR-A for past cases.
More specifically, in the example of FIG. 12, the surgical procedure data DR-A1 stores the surgical procedure xxxx, and the surgeon data DR-A2 stores the surgeon yyyy.
As the workflow data DR-A3, in the example of FIG. 12, preoperative workflow data DR-A31 and intraoperative workflow data DR-A32 are stored.
The preoperative workflow data DR-A31 and the intraoperative workflow data DR-A32 each store one or more pieces of detailed data for each arrangement pattern of persons and things (for example, pattern A).
For example, for arrangement pattern A of persons and things, two pieces of detailed data are stored, each containing the role of the person, the work content of the person, the posture of the person, the position of the person, the color of the clothing, the equipment used (type and product name), and the equipment position. The detailed data further stores the distance from the patient or the vector from the patient.
As described above, according to the second embodiment, by performing weighting based on the vector (direction, distance), which is almost fixed for each surgical procedure, from a body part such as the patient's head to the person or thing whose work content is to be estimated, and then estimating the role of the person, the accuracy of estimating the role and work content of the person can be improved in addition to the effect of the first embodiment.
[3] Third Embodiment
FIG. 13 is a detailed configuration block diagram of a main part of the surgical operation information management system of the third embodiment.
The third embodiment of FIG. 13 differs from the first embodiment of FIG. 2 in that it includes a weighting coefficient addition unit 12E that generates and outputs a weighting coefficient based on the surgical procedure information (information on the surgery content) from the operating room system 20A constituting the in-hospital system 20.
For each surgical procedure, the workflow (procedure steps) and working times are almost fixed.
For example, if a certain task was performed 30 minutes after the start of equipment carry-in and 10 minutes before the time-out (safety confirmation immediately before surgery), it is unlikely that the task is an intraoperative task.
Therefore, by inputting the surgical procedure information from the operating room system 20A into the weighting coefficient addition unit 12E, changing the weighting of the work contents accordingly, and outputting the result to the situation determination unit 12B, the situation determination unit 12B can use the relative time from the event of each task (the time-out in the above example) for estimating the work content, and the probability that the estimated role and work content of the person are correct can be increased.
In addition, for each surgeon, the arrangement of persons and things, the equipment used, and the workflow (procedure steps) of each surgical procedure are almost fixed.
Therefore, by inputting the surgical procedure information corresponding to the surgeon from the operating room system 20A into the weighting coefficient addition unit 12E, changing the weighting of the work contents accordingly, and outputting the result to the situation determination unit 12B, the situation determination unit 12B can use the surgeon's surgical procedure for estimating the work content, and the probability that the estimated role and work content of the person are correct can be increased.
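The sketch below illustrates one way such event-relative times could adjust the work-content scores. The phase boundaries, the "(2-x)" prefix convention for procedure work contents, and the penalty value are illustrative assumptions.

```python
def surgery_phase(t, time_out, sign_out):
    """Coarse phase of the surgery derived from key events (time-out, sign-out)."""
    if t < time_out:
        return "preoperative"
    if t < sign_out:
        return "intraoperative"
    return "postoperative"

def apply_phase_weighting(work_scores, t, time_out, sign_out, penalty=0.2):
    """Down-weight work-content hypotheses that are implausible in the current
    phase, e.g. procedure work contents (2-x) before the time-out."""
    phase = surgery_phase(t, time_out, sign_out)
    weighted = {}
    for work, score in work_scores.items():
        if phase == "preoperative" and work.startswith("(2-"):
            weighted[work] = score * penalty
        else:
            weighted[work] = score
    return weighted
```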
FIG. 14 is an explanatory diagram of an example of the data acquired by the data processing unit in the third to fifth embodiments.
As shown in FIG. 14, the data processing unit 12A of the data processing device 12 acquires, in addition to the basic data D0 (see FIG. 7) containing the posture and position of each person, the type and position of equipment as a thing, and the imaging time for each frame, surgical procedure data D1, surgeon data D2, equipment information data D3, and color information data D4.
In this case, in the third embodiment, the data processing unit 12A only needs to acquire at least the basic data D0, the surgical procedure data D1 (the surgical procedure xxxx in the example of FIG. 12), and the surgeon data D2 (the surgeon yyyy in the example of FIG. 12).
As a result, according to the third embodiment, as in the second embodiment, the accuracy of estimating the role and work content of the person can be improved in addition to the effect of the first embodiment.
[4] Fourth Embodiment
FIG. 15 is a detailed configuration block diagram of a main part of the surgical operation information management system of the fourth embodiment.
The fourth embodiment of FIG. 15 differs from the first embodiment of FIG. 2 in that it includes a weighting coefficient addition unit 12F that generates and outputs a weighting coefficient based on the surgical procedure information (information on the surgery content) from the operating room system 20A constituting the in-hospital system 20 and the equipment information from the operating room devices 41.
In the fourth embodiment, among the acquired data D0 to D4 shown in FIG. 14, the data processing unit 12A only needs to acquire at least the basic data D0 and the equipment information data D3 (in the example of FIG. 14, the functions of equipment A (functions A and B), its operation start time (time xx:ya) and operation end time (time xx:yb), and the function of equipment B (function C) together with its operation start time and operation end time).
For each surgical procedure, the equipment used and the operating time of that equipment are almost fixed.
For example, if a person is operating the buttons of a monitor and the operation time (or the time from the start to the end of the operation) is close to the operation start time of the monitor, the operation is likely the monitor setup work at carry-in. If the operation time is close to the operation end time, it is likely the shutdown work at carry-out.
Therefore, the surgical procedure information from the operating room system 20A and the equipment information from the operating room devices 41 are input into the weighting coefficient addition unit 12F, which changes the weighting of the operating status of the operating room devices 41 and outputs it to the situation determination unit 12B.
Here, examples of the equipment information include operation information, operating times, and setting information from the operating room devices 41.
As a result, the situation determination unit 12B can use the operating status of the operating room devices 41 for estimating the work content, and the probability that the estimated role and work content of the person are correct can be increased.
As a result, according to the fourth embodiment, as in the second and third embodiments, the accuracy of estimating the role and work content of the person can be improved in addition to the effect of the first embodiment.
[5] Fifth Embodiment
FIG. 16 is a detailed configuration block diagram of a main part of the surgical operation information management system of the fifth embodiment.
The fifth embodiment of FIG. 16 differs from the first embodiment of FIG. 2 in that it includes a color identification processing unit 12G that outputs color information based on the imaging data, and a weighting coefficient addition unit 12H that generates and outputs a weighting coefficient based on the surgical procedure information (information on the surgery content) from the operating room system 20A constituting the in-hospital system 20 and the color information from the color identification processing unit 12G.
In this case, the color identification processing unit 12G performs color detection using OpenCV (Open Source Computer Vision Library) or the like, thereby acquiring the colors in the image together with the positions at which the colors are detected, and outputs them as color information. Examples of data formats for representing color include RGB, YUV, YPbPr (YCbCr), and HSV.
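As an illustration of such color identification, the sketch below converts a frame to HSV with OpenCV and reports where a given clothing color appears. The HSV range used here for "green scrubs" is an assumed example value, not part of the present description.

```python
import cv2
import numpy as np

def detect_scrub_color(frame_bgr, lower_hsv=(35, 60, 60), upper_hsv=(85, 255, 255)):
    """Return pixel positions (centroids of connected regions) where the given
    HSV color range (here, an assumed 'green scrubs' range) is found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    positions = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            positions.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return positions
```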
In the fifth embodiment, among the acquired data D0 to D4 shown in FIG. 14, the data processing unit 12A only needs to acquire at least the basic data D0 and the color information data D4 (color A at position A, color A at position B, and color C at position C in the example of FIG. 14).
Depending on the hospital where the surgery is performed, the color of the clothing may be determined by the role of the staff (doctors, nurses, operating room staff, etc.).
Therefore, by inputting the surgical procedure information from the operating room system 20A and the color information from the color identification processing unit 12G into the weighting coefficient addition unit 12H, the weighting of each person's role according to the color of the clothing is changed and output to the situation determination unit 12B.
As a result, the situation determination unit 12B can use the color of a person's clothing for estimating the work content, and the probability that the estimated role and work content of the person are correct can be increased.
As a result, according to the fifth embodiment, as in the second to fourth embodiments, the accuracy of estimating the role and work content of the person can be improved in addition to the effect of the first embodiment.
[6] Modifications of the Embodiments
The embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.
In the above description, updating of the determination database 13 was not described, but the system may also be configured such that data acquired for newly obtained cases is additionally registered in the determination database at any time to update it.
In the above description, the management server 14 has been described as a single device, but it can also be constructed as a cloud server composed of a plurality of devices.
In the above description, the case where a camera is used as the sensor of the data collection device for determining the role of a person (for example, an operator) has been described, but any sensor capable of determining the role may be used.
Further, the present technology can also have the following configurations.
(1)
A data processing unit that acquires information about a person or object included in a captured image of an area related to surgery,
A situation determination unit that estimates the role of the person in the surgery based on the information about the person or the object.
Surgical operation information management system equipped with.
(2)
The data processing unit acquires at least one of the position information of the person or the object, the discrimination result of the person or the object, and the posture information of the person as the information about the person or the object.
The surgical operation information management system described in (1).
(3)
A determination database that stores determination data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases is provided.
The situation determination unit estimates the role of the person in the surgery by comparing the information about the person or object with the determination data.
The surgical operation information management system according to (1) or (2).
(4)
The situation determination unit estimates the role based on at least one of the positional relationship between the patient and the person who is the target of the operation and the positional relationship between the person and the object.
The surgical operation information management system according to any one of (1) to (3).
(5)
The situation determination unit estimates the role for each surgical procedure based on the position of the person or the object with respect to the patient who is the target of the surgery.
The surgical operation information management system according to any one of (1) to (4).
(6)
The situation determination unit estimates the role for each surgical procedure based on a vector from the patient who is the target of the surgery to the person or the object.
The surgical operation information management system according to any one of (1) to (4).
(7)
The situation determination unit estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
The surgical operation information management system according to any one of (1) to (4).
(8)
The situation determination unit estimates the role for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
The surgical operation information management system according to any one of (1) to (4).
(9)
The situation determination unit estimates the role based on the operating status of the equipment used for the surgery for each surgical procedure.
The surgical operation information management system according to any one of (1) to (4).
(10)
A data collecting device for acquiring the captured image by imaging the region related to the surgery is provided.
The surgical operation information management system according to any one of (1) to (9).
(11)
The data collection device includes a plurality of cameras that output the captured image.
The surgical operation information management system according to (10).
(12)
A presentation unit that presents information about the role.
The surgical operation information management system according to any one of (1) to (11).
(13)
The person is a surgeon,
The surgical operation information management system according to any one of (1) to (12).
(14)
The process of acquiring information about a person or object contained in a captured image of an area related to surgery,
The process of estimating the role of the person in the surgery based on the information about the person or object, and
Surgical operation information management method equipped with.
(15)
In the process of acquiring the information, at least one of the position information of the person or the object, the discrimination result of the person or the object, and the posture information of the person is acquired as the information about the person or the object.
The surgical operation information management method according to (14).
(16)
The process of estimating the role refers to a determination database that stores determination data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases, and
estimates the role of the person in the surgery by comparing the information about the person or object with the determination data.
The surgical operation information management method according to (14) or (15).
(17)
The process of estimating the role estimates the role based on at least one of the positional relationship between the patient and the person to be operated on and the positional relationship between the person and the object.
The surgical operation information management method according to any one of (14) to (16).
(18)
In the process of estimating the role, the role is estimated based on the position of the person or the object with respect to the patient who is the target of the operation for each surgical procedure.
The surgical operation information management method according to any one of (14) to (17).
(19)
In the process of estimating the role, the role is estimated based on the vector from the patient who is the target of the operation to the person or the object for each surgical procedure.
The surgical operation information management method according to any one of (14) to (17).
(20)
The process of estimating the role estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
The surgical operation information management method according to any one of (14) to (17).
(21)
In the process of estimating the role, the role is estimated for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
The surgical operation information management method according to any one of (14) to (17).
(22)
In the process of estimating the role, the role is estimated based on the operating status of the equipment used for the surgery for each surgical procedure.
The surgical operation information management method according to any one of (14) to (17).
(23)
With the process of presenting information about the role,
The surgical operation information management method according to any one of (14) to (22).
(24)
The person is a surgeon,
The surgical operation information management method according to any one of (14) to (23).
(25)
A program for controlling a surgical operation information management system by a computer, the program causing the computer to function as:
a means for acquiring information about a person or an object included in a captured image of an area related to surgery; and
a means for estimating the role of the person in the surgery based on the information about the person or the object.
(26)
The means for acquiring the information acquires at least one of the position information of the person or the object, the discrimination result of the person or the object, and the posture information of the person as the information about the person or the object.
The program according to (25).
(27)
The surgical operation information management system includes a judgment database that stores judgment data including at least one of the position, posture, equipment used, and work content of the person for each role for a plurality of cases.
The means for estimating the role estimates the role of the person in the surgery by comparing the information about the person or the object with the judgment data.
The program according to (25) or (26).
(28)
The means for estimating the role estimates the role based on at least one of the positional relationship between the patient who is the target of the surgery and the person, and the positional relationship between the person and the object.
The program according to any one of (25) to (27).
(29)
The means for estimating the role estimates the role, for each surgical procedure, based on the position of the person or the object with respect to the patient who is the target of the surgery.
The program according to any one of (25) to (28).
(30)
The means for estimating the role estimates the role for each surgical procedure based on a vector from the patient who is the target of the surgery to the person or the object.
The program according to any one of (25) to (28).
(31)
The means for estimating the role estimates the role for each surgical procedure based on the relative time from each work step in the surgery.
The program according to any one of (25) to (28).
(32)
The means for estimating the role estimates the role for each operation in consideration of the surgical procedure associated with the surgeon in charge of the operation.
The program according to any one of (25) to (28).
(33)
The means for estimating the role estimates the role, for each surgical procedure, based on the operating status of the equipment used for the surgery.
The program according to any one of (25) to (28).
(34)
Further causing the computer to function as a means for presenting information about the role,
The program according to any one of (25) to (33).
(35)
The person is a surgeon,
The program according to any one of (25) to (34).
10  Surgical operation information management system
11  Data collection device
11A Camera
12  Data processing device
12A Data processing unit
12B Situation determination unit
12C Patient estimation processing unit
12D to 12F Weighting coefficient addition units
12G Color identification processing unit
12H Weighting coefficient addition unit
13  Judgment database
14  Management server
15  Information presentation system
15A Information presentation processing unit
15B Display
20  In-hospital system
20A Operating room system
21  Case-specific data
31  Role data
32  Work content data
33  Time data
41  Operating room equipment
D0  Basic data
D1  Surgical procedure data
D2  Surgeon data
D3  Equipment information data
D4  Color information data
DA1 Distance data
DA2 Distance data
DA3 Vector data
DR  Judgment data

Claims (15)

1. A surgical operation information management system comprising:
   a data processing unit that acquires information about a person or an object included in a captured image of an area related to surgery; and
   a situation determination unit that estimates a role of the person in the surgery based on the information about the person or the object.
2. The surgical operation information management system according to claim 1, wherein the data processing unit acquires, as the information about the person or the object, at least one of position information of the person or the object, a discrimination result of the person or the object, and posture information of the person.
3. The surgical operation information management system according to claim 1, further comprising a judgment database that stores, for a plurality of cases, judgment data including at least one of a position, a posture, equipment used, and work content of the person for each role, wherein the situation determination unit estimates the role of the person in the surgery by comparing the information about the person or the object with the judgment data.
4. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role based on at least one of a positional relationship between a patient who is a target of the surgery and the person, and a positional relationship between the person and the object.
5. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role, for each surgical procedure, based on a position of the person or the object with respect to a patient who is a target of the surgery.
6. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role, for each surgical procedure, based on a vector from a patient who is a target of the surgery to the person or the object.
7. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role, for each surgical procedure, based on a relative time from each work step in the surgery.
8. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role in consideration of a surgical procedure associated with a surgeon in charge of the surgery.
9. The surgical operation information management system according to claim 1, wherein the situation determination unit estimates the role, for each surgical procedure, based on an operating status of equipment used for the surgery.
10. The surgical operation information management system according to claim 1, further comprising a data collection device that acquires the captured image by imaging the area related to the surgery.
11. The surgical operation information management system according to claim 10, wherein the data collection device includes a plurality of cameras that output the captured image.
12. The surgical operation information management system according to claim 1, further comprising a presentation unit that presents information about the role.
13. The surgical operation information management system according to claim 1, wherein the person is a surgeon.
14. A surgical operation information management method comprising:
    a process of acquiring information about a person or an object included in a captured image of an area related to surgery; and
    a process of estimating a role of the person in the surgery based on the information about the person or the object.
15. A program for controlling a surgical operation information management system by a computer, the program causing the computer to function as:
    a means for acquiring information about a person or an object included in a captured image of an area related to surgery; and
    a means for estimating a role of the person in the surgery based on the information about the person or the object.
PCT/JP2021/015944 2020-04-27 2021-04-20 System, method, and program for surgery operating information management WO2021220872A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-078720 2020-04-27
JP2020078720 2020-04-27

Publications (1)

Publication Number Publication Date
WO2021220872A1 true WO2021220872A1 (en) 2021-11-04

Family

ID=78331525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/015944 WO2021220872A1 (en) 2020-04-27 2021-04-20 System, method, and program for surgery operating information management

Country Status (1)

Country Link
WO (1) WO2021220872A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010119892A1 (en) * 2009-04-14 2010-10-21 株式会社ホギメディカル Analysis equipment and program for analysis equipment
JP2015029791A (en) * 2013-08-05 2015-02-16 株式会社東芝 Medical appliance operation support device and ultrasonic diagnostic device
JP2017033047A (en) * 2015-07-28 2017-02-09 株式会社コンピュータシステム研究所 Safety management support device, safety management support program, and storage medium

Similar Documents

Publication Publication Date Title
US11367304B2 (en) Method and system for surgical instrumentation setup and user preferences
US20220336078A1 (en) System and method for tracking a portion of the user as a proxy for non-monitored instrument
KR102458587B1 (en) Universal device and method to integrate diagnostic testing into treatment in real-time
JP4296278B2 (en) Medical cockpit system
US20230023635A1 (en) Hub identification and tracking of objects and personnel within the or to overlay data that is custom to the user's need
US11779423B2 (en) Real-time adjustment of haptic feedback in surgical robots
JP4776937B2 (en) Surgery system
WO2017151904A1 (en) Methods and systems for anatomical image registration
WO2021220872A1 (en) System, method, and program for surgery operating information management
WO2023002661A1 (en) Information processing system, information processing method, and program
WO2022249097A2 (en) Adaptive control of operating room systems
WO2022249100A1 (en) Efficiency of motion monitoring and analysis for a surgical procedure
WO2022249084A1 (en) Aggregated network of surgical hubs for efficiency analysis
EP4293680A1 (en) Systems and methods for monitoring surgical workflow and progress
US20220384018A1 (en) Ergonomic monitoring and analysis for an operating room
US20210158947A1 (en) Beacon-based systems and methods for generating medical facility metrics
WO2022249093A2 (en) Ergonomic monitoring and analysis for an operating room
WO2022219491A1 (en) System and method for tracking a portion of the user as a proxy for non-monitored instrument
WO2022249103A1 (en) Monitoring a health care professional movement relative to a virtual boundary in an operating room
JP2002119520A (en) Surgical system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21795801

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21795801

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP