WO2024090228A1 - Information processing device, information processing system, and information processing program - Google Patents

Information processing device, information processing system, and information processing program

Info

Publication number
WO2024090228A1
Authority
WO
WIPO (PCT)
Prior art keywords
room terminal
terminal
instruction
room
operating room
Application number
PCT/JP2023/037094
Other languages
English (en)
Japanese (ja)
Inventor
伊知朗 竹政
雅彦 篠原
雅夫 渡邉
義則 長谷川
伸 伊藤
由祐子 篠原
Original Assignee
伊知朗 竹政
雅彦 篠原
Application filed by 伊知朗 竹政 and 雅彦 篠原
Publication of WO2024090228A1


Description

  • This disclosure relates to an information processing device, an information processing system, and an information processing program that are connected to an operating room terminal, an instruction room terminal, and a participant room terminal via a network.
  • Surgery by medical-association-certified doctors is videotaped, and the footage is stored on a recording device such as a hard disk drive (HDD) or a digital versatile disc (DVD).
  • Medical-association-certified doctors are called expert doctors. Young doctors can view the footage of surgeries recorded on these recording devices to hone their surgical skills.
  • An example of this technology is described in Patent Document 1.
  • Patent Document 1 describes a medical imaging system comprising: an imaging means configured to be able to control imaging conditions and to image an affected area; an acquisition means for acquiring information regarding the imaging conditions set in the imaging means; and a control means for controlling, based on the imaging conditions, communication conditions for transmitting, via a network, image data corresponding to an imaging result of the affected area by the imaging means.
  • The inventors of this application recognized that the technology described in Patent Document 1 cannot switch, among the operating room terminal, the instruction room terminal, and the participating room terminal, which terminals can mutually exchange information including video, audio, and annotations.
  • The objective of the present disclosure is to provide an information processing device, an information processing system, and an information processing program that can switch, among the operating room terminal, the instruction room terminal, and the participating room terminal, which terminals can mutually communicate information including video, audio, and annotations.
  • The information processing device disclosed herein comprises: a communication unit connected via a network to each of an operating room terminal provided in an operating room and having a display unit for displaying video and annotations, a microphone for acquiring audio within the operating room, and a speaker for outputting audio acquired from outside the operating room; an instruction room terminal provided in an instruction room separated from the operating room and having a display unit for displaying video and annotations, a microphone for acquiring audio within the instruction room, and a speaker for outputting audio acquired from outside the instruction room; and a participation room terminal provided in a participation room separated from the operating room and the instruction room and having a display unit for displaying video and annotations, a microphone for acquiring audio within the participation room, and a speaker for outputting audio acquired from outside the participation room; a storage unit in which a non-transitory program is stored; and a control circuit connected to the communication unit and the storage unit and configured to execute processing when the non-transitory program stored in the storage unit is started.
  • The control circuit processes video acquired from the operating room terminal, the instruction room terminal, and the participation room terminal; processes audio acquired from the operating room terminal, the instruction room terminal, and the participation room terminal; processes instruction and annotations acquired from the instruction room terminal; and switches the transmission state of information, including video, audio, and annotations, between the communication unit and the operating room terminal, the instruction room terminal, and the participation room terminal.
  • The control circuit switches among and starts up a first mode, a second mode, and a third mode in which the transmission state of information differs. When the first mode is started up, audio can be communicated bidirectionally between the operating room terminal and the instruction room terminal; video of the operating room terminal can be transmitted from the operating room terminal to the communication unit and unidirectionally to the participation room terminal; audio can be transmitted unidirectionally from the operating room terminal to the participation room terminal; audio can be transmitted unidirectionally from the instruction room terminal to the participation room terminal; annotations can be transmitted unidirectionally from the instruction room terminal to the operating room terminal; and annotations can be transmitted unidirectionally from the instruction room terminal to the participation room terminal.
  • When the second mode is started up, audio can be communicated bidirectionally between the participation room terminal and the instruction room terminal; audio can be transmitted unidirectionally from the operating room terminal to the instruction room terminal; audio cannot be communicated between the participation room terminal and the operating room terminal; annotations can be communicated bidirectionally between the instruction room terminal and the participation room terminal; annotations cannot be communicated between the participation room terminal and the operating room terminal; and annotations cannot be communicated between the instruction room terminal and the operating room terminal. When the third mode is started up, a transmission state different from those of the first mode and the second mode is established. An information processing device capable of the above is thereby provided.
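  • As a minimal, non-limiting sketch (the mode names, terminal labels, and data structure below are assumptions introduced only for illustration, and third-mode entries are omitted), the transmission states described for the first and second modes could be held by the server as a permission table that the switching unit consults for each channel:

        # Illustrative permission table approximating the first- and second-mode
        # descriptions above; it is not the claimed implementation.
        from enum import Enum

        class Mode(Enum):
            FIRST = 1   # e.g. a proctor-style mode
            SECOND = 2  # e.g. a demo-style mode
            THIRD = 3   # transmission state differs again; entries omitted here

        OR, IR, PR = "operating_room_terminal", "instruction_room_terminal", "participation_room_terminal"

        # Allowed one-way links per mode as (source, destination, channel);
        # bidirectional communication is expressed as two one-way entries.
        ALLOWED = {
            Mode.FIRST: {
                (OR, IR, "audio"), (IR, OR, "audio"),        # bidirectional audio
                (OR, PR, "video"), (OR, PR, "audio"),        # one-way to the participation room
                (IR, PR, "audio"),
                (IR, OR, "annotation"), (IR, PR, "annotation"),
            },
            Mode.SECOND: {
                (PR, IR, "audio"), (IR, PR, "audio"),        # bidirectional audio
                (OR, IR, "audio"),                           # one-way from the operating room
                (IR, PR, "annotation"), (PR, IR, "annotation"),
                # no audio or annotations between the participation room and the operating room,
                # and no annotations between the instruction room and the operating room
            },
        }

        def may_transmit(mode: Mode, src: str, dst: str, channel: str) -> bool:
            """Return True if the switching unit would pass `channel` from src to dst."""
            return (src, dst, channel) in ALLOWED.get(mode, set())

        # Example: in the first mode, annotations flow only out of the instruction room.
        assert may_transmit(Mode.FIRST, IR, OR, "annotation")
        assert not may_transmit(Mode.FIRST, OR, IR, "annotation")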
  • During surgery, it is possible to switch, among the operating room terminal, the instruction room terminal, and the participant room terminal, which terminals can exchange information including video, audio, and annotations.
  • a surgeon using an operating room terminal can reduce the stress he or she experiences when the situation in the operating room is viewed from the outside, i.e., by an instructor using an instruction room terminal and participants using participant room terminals.
  • appropriate instruction and annotations can be provided from the instruction room terminal to the operating room terminal according to the surgery being performed by the surgeon.
  • FIG. 1A is a block diagram showing a first specific example of a remote instruction system, which is an example of an information processing system according to the present disclosure.
  • FIG. 1B is a block diagram showing an example of a server included in the remote instruction system.
  • FIG. 1C is a block diagram showing an example of an operating room included in the remote instruction system.
  • FIG. 1D is a block diagram showing an example of an instruction room included in the remote instruction system.
  • FIG. 1E is a block diagram showing an example of a participation room included in the remote instruction system.
  • FIG. 2A is a schematic diagram showing an example of a display on a monitor installed in the operating room.
  • FIG. 2B is a schematic diagram showing an example of a display on a monitor installed in the instruction room.
  • FIG. 3 is a block diagram showing a second specific example of the remote instruction system of the present disclosure.
  • A further figure is a flowchart illustrating an example of a remote instruction method performed by the remote instruction system of the present disclosure.
  • A further figure is an example of a screen displayed on a monitor of the instruction room terminal.
  • A further figure is a schematic diagram showing the mode switching function in the present disclosure.
  • A further figure is a conceptual diagram showing the API when the proctor mode is selected.
  • A further figure is a conceptual diagram showing the APIs when the demo mode is selected.
  • A further figure is a conceptual diagram showing the APIs when the full mode is selected.
  • A further figure is a conceptual diagram showing a sequence process performed when an icon on the instruction room terminal is operated in the present disclosure.
  • An operating room terminal used by a subject receiving medical instruction, specifically instruction on surgery, and an instruction room terminal used by an instructor who instructs the subject are connected via a network.
  • a remote instruction system is disclosed as an example of an information processing system.
  • a remote instruction program is disclosed as an example of an information processing program.
  • The main intended use of the remote instruction system of the present disclosure is so-called remote proctoring, in which a surgeon performs surgery on a patient in an operating room and real-time instruction on the surgery is provided from a location away from the operating room.
  • Images used in the remote instruction system include both 2D images and 3D images.
  • the images include annotations and pointers.
  • The 3D images are, for example, stereoscopic images.
  • the remote instruction system of the present disclosure includes the following use examples.
  • the remote instruction system disclosed herein can be used in the first scenario, for example, in real-time instruction in a "one-on-one" situation.
  • the first "one” is a young surgeon
  • the second "one” is a supervising physician.
  • the young surgeon submits video footage taken during surgery as proof of skill to the various academic societies.
  • the young surgeon can use the remote instruction system during actual surgery and receive remote instruction from the supervising physician.
  • the young surgeon can use the video footage actually taken during surgery as proof of skill.
  • the remote instruction system disclosed herein can be used in a second scenario, for example, a video clinic in a "1 vs. n" situation.
  • "1" is a young surgeon.
  • "n” are instructors other than the young surgeon, and "n” is an integer equal to or greater than 1.
  • a young surgeon submits video footage as proof of skills to become an academic society-certified specialist.
  • the remote instruction system records footage of the young surgeon performing surgery without guidance from an instructor, and shares the footage with instructors, colleagues, surgeons affiliated with other medical facilities, etc., allowing the young surgeon to receive guidance and hold discussions.
  • the remote guidance system disclosed herein can be used in a third scene, for example, in a live open academic conference in a "1 vs. 1 vs. n" situation.
  • the first "1" is the surgeon
  • the second "1” is the supervising physician
  • the "n" are the participants.
  • the remote guidance system provides information including video of the surgery to academic conferences, open lectures, and other venues while the surgeon is performing the surgery.
  • the remote guidance system has a function for the instructor to provide guidance to the surgeon and display annotations and a pointer while the surgeon is performing the surgery.
  • the remote guidance system disclosed herein has a function for providing guidance, displaying annotations, displaying a pointer, and transmitting the information to the surgeon and participants.
  • the remote guidance system disclosed herein has a function for not transmitting video and audio including discussions between the supervising physician and participants to the surgeon.
  • the remote instruction system of the present disclosure has a function that can be used in a video-on-demand situation.
  • the remote instruction system of the present disclosure has a function to automatically record and upload information including, for example, video, instruction, annotations, and pointers obtained in the first scene, the second scene, and the third scene.
  • the remote instruction system of the present disclosure has a function to upload files including various information.
  • the remote instruction system of the present disclosure has a function to manage uploaded videos.
  • the remote instruction system of the present disclosure has a function to manage viewing of uploaded videos.
  • the remote instruction system of the present disclosure has a function to manage sharing of uploaded videos.
  • the remote instruction system disclosed herein has the function of live-streaming information including the video, instruction, annotations, and pointers obtained in the third scene in one direction to multiple users simultaneously.
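  • Purely as an illustrative sketch (the record layout and field names below are assumptions, not the disclosed implementation), the upload, viewing-management, sharing-management, and live-streaming functions listed above could be modeled on the server as a simple per-video record:

        # Hypothetical record for an uploaded surgery video; all field names are assumed.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class UploadedVideo:
            video_id: str
            scene: str                    # e.g. "first", "second", or "third" scene
            annotation_file: str          # recorded annotations and pointers bundled with the video
            allowed_viewers: List[str] = field(default_factory=list)   # viewing management
            shared_with: List[str] = field(default_factory=list)       # sharing management
            live_streamed: bool = False   # one-way live distribution to multiple users

            def grant_view(self, user_id: str) -> None:
                """Allow one more user to view this uploaded video."""
                if user_id not in self.allowed_viewers:
                    self.allowed_viewers.append(user_id)

        # Usage example with dummy identifiers.
        video = UploadedVideo(video_id="v001", scene="third", annotation_file="v001_annotations.json")
        video.grant_view("participant_17")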
  • This disclosure can be used in cases such as training specialists, resolving the shortage of surgeons, researching and learning advanced surgical cases, medical training in rural areas, and instructors remotely teaching subjects. This disclosure can also be used when actually performing surgery at medical institutions. It is also effective in creating career paths for female surgeons and a medical service provision foundation with minimal regional disparities.
  • The first specific example is an example in which surgery is performed on a patient in an operating room, and guidance on the surgery is given in real time from a location away from the operating room. More specifically, in this example, surgery is performed manually by a person involved in the surgery while the patient is lying on an operating table.
  • the first specific example is shown in Figures 1A, 1B, 1C, 1D, 1E, 2A, and 2B.
  • (Explanation of overall configuration) The remote instruction system 10 shown in FIG. 1A includes a server 11, an operating room terminal 13, an instruction room terminal 14, and a participating room terminal 43.
  • the server 11, the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43 are connected to each other via a network 12.
  • the server 11 is also connected to a processing terminal 81 via the network 12.
  • the network 12 includes the Internet, an intranet, a wireless LAN (Local Area Network), and a mobile phone system network such as 4G or 5G.
  • the operating room terminal 13 and the teaching room terminal 14 can be indirectly connected via the server 11.
  • the operating room terminal 13 and the teaching room terminal 14 can also bypass the server 11 and directly communicate bidirectionally.
  • FIG. 1A shows an example in which a single operating room terminal 13 is connected to the server 11, a single teaching room terminal 14 is connected to the server 11, and a single participating room terminal 43 is connected to the server 11, but multiple operating room terminals 13 may be connected to the server 11. Also, multiple teaching room terminals 14 may be connected to the server 11. Also, multiple participating room terminals 43 may be connected to the server 11.
  • the network 12 is constructed by at least one of wireless communication and wired communication. Note that wireless communication includes Wi-Fi (registered trademark).
  • the communication circuits for signals including information such as video, images, audio, and data are constructed using a proprietary interface and compression transmission technology.
  • the proprietary interface includes an encoder for signal processing, and a browser and application for connecting to the server 11 using an Internet line.
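  • The actual interface and compression transmission technology are proprietary and not detailed here; purely as an analogy, a terminal-side sender might encode each captured frame before handing it to the Internet connection to the server 11, as in the following sketch (zlib and the message format are stand-ins chosen only for illustration):

        # Illustrative stand-in for the terminal-side encode-and-send path.
        # zlib is used only as a placeholder for the proprietary compression technology.
        import json
        import zlib

        def encode_frame(raw_frame: bytes, terminal_id: str, seq: int) -> bytes:
            """Compress a captured frame and wrap it with minimal metadata (assumed format)."""
            payload = zlib.compress(raw_frame)
            header = json.dumps({"terminal": terminal_id, "seq": seq, "size": len(payload)}).encode()
            return header + b"\n" + payload

        def decode_frame(message: bytes) -> bytes:
            """Reverse of encode_frame: strip the metadata line and decompress."""
            _, payload = message.split(b"\n", 1)
            return zlib.decompress(payload)

        # Round-trip check with dummy data.
        frame = bytes(1024)
        assert decode_frame(encode_frame(frame, "operating_room_13", seq=1)) == frame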
  • the operating room terminal 13 is used by the subject who receives the instruction on surgery.
  • the subjects include the surgeon, nurses, assistants, etc.
  • the operating room terminal 13 is provided, for example, in the operating room 15.
  • the instruction room terminal 14 is used by the instructor who provides the instruction on surgery to the subject.
  • the instruction room terminal 14 is provided in the instruction room 16.
  • the operating room 15 and the instruction room 16 are, for example, located in different places. The different places may be different countries, different prefectures, different cities, towns, villages, different buildings, different rooms in the same building, etc.
  • the participating room terminal 43 is used by a trainee doctor who views the situation in the operating room 15 and studies the instruction of the instructor.
  • the participating room terminal 43 is used in a place other than the operating room 15 and the instruction room 16.
  • An example of the configuration of the server 11, the configuration of the operating room 15, the configuration of the instruction room 16, and the configuration of the participating room terminal 43 is as follows.
  • the server 11 has a control circuit 17, an operation unit 18, a monitor 19, a communication unit 20, and a storage unit 22.
  • the server 11 is provided in a location different from the operating room 15 and the instruction room 16.
  • the control circuit 17 has a video processing unit 82, a voice processing unit 83, an annotation processing unit 84, a robot information processing unit 85, an auxiliary information processing unit 86, a switching unit 105, and a mode selection support unit 92.
  • the control circuit 17 performs various processes and judgments based on various information acquired from the operating room terminal 13, various information acquired from the instruction room terminal 14, and various information acquired from the participation room terminal 43, and can store the processing results and judgment results in the database of the storage unit 22.
  • the various information includes, for example, video, audio, annotations, pointers, folders, files, data, etc.
  • the control circuit 17 may convert audio information into text in real time and interpret it into a foreign language such as English.
  • the interpretation method may be automatic interpretation using artificial intelligence technology such as deep learning.
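  • A minimal sketch of this optional real-time transcription and interpretation path is shown below; the transcriber and translator callables stand in for whatever speech-recognition and machine-translation models (for example, deep-learning based ones) are actually used, and their interfaces are assumptions:

        # Hypothetical pipeline for the optional speech-to-text and interpretation step.
        # `transcriber` and `translator` are placeholders for real models; their
        # interfaces here are assumptions for illustration only.
        from typing import Callable

        def interpret_audio(audio_chunk: bytes,
                            transcriber: Callable[[bytes], str],
                            translator: Callable[[str, str], str],
                            target_language: str = "en") -> dict:
            """Convert an audio chunk to text, then interpret it into the target language."""
            source_text = transcriber(audio_chunk)
            translated = translator(source_text, target_language)
            return {"source_text": source_text, "translated_text": translated}

        # Dummy stand-ins so the sketch runs without any external model.
        result = interpret_audio(b"\x00" * 160,
                                 transcriber=lambda _: "右側を切開します",
                                 translator=lambda text, lang: f"[{lang}] incise the right side")
        print(result["translated_text"])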
  • The video processing unit 82 has a function to process video acquired from the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43, and video read from the storage unit 22.
  • The audio processing unit 83 has a function to process audio acquired from the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43, and audio read from the storage unit 22.
  • The annotation processing unit 84 has a function to process annotations acquired from the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43, and annotations read from the storage unit 22.
  • The annotation processing unit 84 also has a function to associate information including the operation of the robot main body 48 acquired by the robot information processing unit 85 with annotation information generated by the instruction room terminal 14 and store it in the storage unit 22.
  • the robot information processing unit 85 has the function of acquiring information, including the operation of the master control 49 and the operation of the robot main body 48, from the operating room terminal 13, and processing the acquired information.
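  • As one assumed illustration (not the disclosed format), the association performed between robot operation information and annotation information before storage in the storage unit 22 could be a time-stamped pairing such as the following:

        # Hypothetical pairing of robot operation records with annotation records by timestamp.
        from dataclasses import dataclass

        @dataclass
        class RobotOperation:
            timestamp: float       # seconds from the start of surgery (assumed time base)
            arm_id: int
            description: str       # e.g. movement of an operating arm driven by the master control

        @dataclass
        class AnnotationRecord:
            timestamp: float
            terminal: str          # terminal that generated the annotation (e.g. instruction room)
            drawing: str           # serialized drawing B1 (format assumed)

        def associate(operations, annotations, window: float = 2.0):
            """Pair each annotation with robot operations that occurred within `window` seconds."""
            pairs = []
            for ann in annotations:
                nearby = [op for op in operations if abs(op.timestamp - ann.timestamp) <= window]
                pairs.append((ann, nearby))
            return pairs   # the result would then be stored in the storage unit 22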
  • the auxiliary information processing unit 86 can generate auxiliary information with characteristics that change the medical fee depending on the type of mode activated by the switching unit 105. The medical fee will be described later.
  • the communication unit 20 is connected to the network 12. Signals received by the communication unit 20 via the network 12 are input to the control circuit 17. Furthermore, signals output from the communication unit 20 are transmitted via the network 12 to the operating room terminal 13, the instruction room terminal 14, and the participation room terminal 43.
  • the operation unit 18 is constructed with at least one element of a touch switch, a display screen such as a liquid crystal panel, a keyboard, a mouse, etc.
  • a signal corresponding to the operation content is input to the control circuit 17.
  • the monitor 19 has a display screen.
  • the display screen of the monitor 19 is constructed with a touch panel, a liquid crystal panel, etc.
  • the content displayed on the display screen of the monitor 19 is controlled by the control circuit 17.
  • the display screen of the operation unit 18 and the display screen of the monitor 19 may be provided separately or may be integrated.
  • the storage unit 22 may be either an internal type provided inside the main body constituting the server 11, or an external type provided outside the main body.
  • the external storage unit 22 may be a hard disk, DVD, stick memory, etc.
  • the storage unit 22 stores non-temporary programs, data, information, etc. used for processing and judgment performed by the control circuit 17.
  • the learning data for machine learning stored in the storage unit 22 may use data from natural language processing performed so far. This allows foreign surgeons to participate, and is expected to improve their surgical skills.
  • the storage unit 22 stores applications that are started by the processing, judgment, control, etc. performed by the control circuit 17, and stores various information, data, etc. used for processing, judgment, control, etc. performed by the control circuit 17.
  • the storage unit 22 also stores various information and data, etc. to be sent to the operating room terminal 13, the instruction room terminal 14, and the participation room terminal 43.
  • the storage unit 22 also stores evaluation criteria to be sent to the participation room terminal 43.
  • the mode selection support unit 92 has a function of supporting the mode selection operation performed on the instruction room terminal 14.
  • the mode selection support unit 92 can perform the following support processing:
  • the mode selection support unit 92 stores the mode selected at the instruction room terminal 14 as data for each medical condition, each surgical content, each surgeon, and each participant in the storage unit 22.
  • The mode selection support unit 92 also stores in the storage unit 22, for each surgical content, medical condition, surgeon, and participant, the number of instructions given from the instruction room terminal 14 to the operating room terminal 13 and the participating room terminal 43 and the time they required, the number of annotations and the time they required, and so on. Thereafter, when surgery is performed in the operating room 15, the mode selection support unit 92 uses the data stored in the storage unit 22 to recommend, from the server 11 to the instruction room terminal 14, the mode in which surgery proceeded most smoothly as an appropriate mode according to the surgical content, case, surgeon, etc.
  • The mode selection support unit 92 can determine, as "the mode in which the surgery proceeded most smoothly," for example, the mode selected when the time required from the start to the end of surgery was the shortest, when the number of instructions given from the instruction room terminal 14 to the operating room terminal 13 during surgery and the time required for those instructions were the smallest, and when the number of annotations made and the time required for those annotations were the smallest.
  • The mode selection support unit 92 stores in the storage unit 22, as data, the relationship between the mode selected at the instruction room terminal 14 and each surgical content, each medical condition, each surgeon, each participant, and so on.
  • the mode selection support unit 92 also acquires from the participation room terminal 43 an evaluation of the instruction and annotation sent from the instruction room terminal 14 to the participation room terminal 43, and stores the acquired evaluation in the storage unit 22. The evaluation acquired from the participation room terminal 43 will be described later. Then, at the time of surgery performed after the evaluation is stored in the storage unit 22, the mode selection support unit 92 performs a process of recommending the mode selected at the time of the surgery that received the highest evaluation from the server 11 to the instruction room terminal 14 according to the surgical content, case, each surgeon, each participant, etc.
  • Before the instructor operates the instruction room terminal 14 to select a mode, the mode selection support unit 92 performs a process of acquiring a desired mode from at least one of the operating room terminal 13 and the participation room terminal 43. The mode selection support unit 92 then performs a process of recommending the acquired desired mode to the instruction room terminal 14.
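  • A simplified sketch of the recommendation logic described above is shown below; the record layout, the smoothness criterion, and the weighting are assumptions and only approximate the criteria named in the text (shortest surgery time, fewest instructions and annotations, highest evaluation from the participation room terminal):

        # Hypothetical recommendation over past surgery records stored in the storage unit 22.
        from dataclasses import dataclass
        from typing import List, Optional

        @dataclass
        class SurgeryRecord:
            mode: str                    # mode selected at the instruction room terminal
            surgery_type: str
            duration_min: float          # time from start to end of surgery
            instruction_count: int       # instructions sent from the instruction room terminal
            instruction_time_min: float
            annotation_count: int
            evaluation: float            # evaluation acquired from the participation room terminal

        def recommend_mode(records: List[SurgeryRecord], surgery_type: str) -> Optional[str]:
            """Recommend the mode whose past surgeries of this type proceeded most smoothly."""
            candidates = [r for r in records if r.surgery_type == surgery_type]
            if not candidates:
                return None
            # "Most smoothly" is approximated here by low duration/instruction/annotation values
            # and a high evaluation; the actual criteria and weights are not specified.
            def smoothness(r: SurgeryRecord) -> float:
                return (-r.duration_min - r.instruction_time_min
                        - r.instruction_count - r.annotation_count + 10.0 * r.evaluation)
            return max(candidates, key=smoothness).mode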
  • The instructor can operate the operation unit 35 of the instruction room terminal 14 according to the first to third scenes described above to select one of a plurality of modes, for example, the first to third modes.
  • the instructor can also select one of the first to third modes based on information recommended by the mode selection support unit 92.
  • the switching unit 105 has a function of activating the selected mode.
  • the first to third modes differ in the state of transmission of various information between the operating room terminal 13 and the instruction room terminal 14, the state of transmission of various information between the instruction room terminal 14 and the participating room terminal 43, and the state of transmission of various information between the operating room terminal 13 and the participating room terminal 43.
  • the switching unit 105 includes a proctor mode circuit 120, a demo mode circuit 121, and a full mode circuit 122. The specific functions of the proctor mode circuit 120, the demo mode circuit 121, and the full mode circuit 122, and specific examples of the first to third modes will be described later.
  • the operating room 15 is provided with an operating room terminal 13, an operating field camera 23, a first camera 24, a second camera 25, a monitor 27, a speaker 28, a microphone 29, and an operation unit 30.
  • the operating field camera 23 includes an endoscope, a laparoscopic camera, and a robot camera.
  • the operating room terminal 13 is a computer used by a subject who receives instruction from an instructor. The subjects include a surgeon performing the surgery, a trainee, a nurse, an assistant for the surgery, etc.
  • the operating room terminal 13 controls and manages the monitor 27, the operating field camera 23, the first camera 24, the second camera 25, the microphone 29, the speaker 28, etc.
  • the operating room terminal 13 has a control circuit 31, a memory unit 32, and a communication unit 33.
  • the control circuit 31 is mainly composed of, for example, a video card (also called a graphics board), and has an encoder and a decoder that convert signals into data, an arithmetic circuit that performs various processes, comparisons, and calculations, and a memory that temporarily stores information.
  • the operating room terminal 13 is connected to the operation unit 30, the monitor 27, the surgical field camera 23, the first camera 24, the second camera 25, the microphone 29, and the speaker 28 so as to be able to transmit and receive signals.
  • the communication unit 33 receives a signal transmitted from the server 11 via the network 12.
  • the signal received by the communication unit 33 is processed by the control circuit 31.
  • the signal output from the communication unit 33 is transmitted to the server 11 via the network 12.
  • the signal input to the communication unit 33 and the signal output from the communication unit 33 include information such as video, images, audio, data, annotations, and pointers.
  • the monitor 27 constructs a display screen using a touch panel.
  • the touch panel may be of any of the following types: capacitive, resistive, or infrared.
  • The monitor 27 and the operating room terminal 13 are connected by a communication line such as HDMI (High-Definition Multimedia Interface; registered trademark) or USB (Universal Serial Bus). Between the monitor 27 and the operating room terminal 13, information signals including video, images, audio, etc. are communicated bidirectionally.
  • FIG. 2A shows an example of a display screen 34 of the monitor 27.
  • the display screen 34 can display a display section 34A that displays an image captured by the surgical field camera 23, a display section 34B that displays an image captured by the first camera 24, a display section 34C that displays an image captured by the second camera 25, and a display section 34D that displays an image transmitted from the instruction room terminal 14.
  • the display screen 34 can also display an operation menu.
  • the display screen 34 is compatible with high-resolution images.
  • the operation unit 30 is constructed with at least one element of a keyboard, a mouse, and a display screen of a touch panel.
  • the operation unit 30 is connected to a control circuit 31 of the operating room terminal 13.
  • a signal corresponding to the operation is input to the control circuit 31.
  • the display screen of the operation unit 30 and the display screen 34 of the monitor 27 may be provided separately or may be integrated.
  • the display screen 34 of the monitor 27 shown in FIG. 2A is an example integrated with the display screen of the operation unit 30.
  • the memory unit 32 may be either an internal type provided inside the main body constituting the operating room terminal 13, or an external type provided outside the main body. Furthermore, the external memory unit 32 may be a hard disk, a DVD (Digital Versatile Disc), a stick memory, etc.
  • the memory unit 32 stores non-temporary programs, data, information, etc. used for processing, judgment, control, etc. performed by the control circuit 31.
  • The endoscope (laparoscope) 23 and the robot body 48 are medical devices intended to observe the inside of the patient who is the subject of surgery, that is, the inside of the human body.
  • the surgical field camera 23 may have a structure and function that allows it to perform specified surgeries and collect specimens.
  • the surgical field camera 23 converts images of the inside of the human body, specifically the surgical field, into a signal and outputs the signal.
  • the surgical field camera 23 may also have a microphone.
  • the surgical field camera 23 and the operating room terminal 13 are connected by a communication line, for example, SDI (Serial Digital Interface), and digital video signals and digital audio signals are transmitted from the surgical field camera 23 to the operating room terminal 13.
  • the first camera 24 and the second camera 25 convert the video captured in the operating room 15 into a signal and output it.
  • the first camera 24 and the second camera 25 are cameras that overlook the operating room 15.
  • the first camera 24, for example, captures video of the entire operating room 15.
  • the shooting range (space) of the second camera 25 is narrower than the shooting range of the first camera 24.
  • the first camera 24 and the second camera 25 differ in their installation locations, shooting locations, shooting directions, etc.
  • the first camera 24 and the second camera 25 are connected in parallel to the operating room terminal 13, for example, by HDMI.
  • the first camera 24 and the second camera 25 capture video of the entire team, including surgical assistants and nurses, i.e., the position of the surgeon, the position of the assistant, the movement of the assistant's forceps, etc.
  • the first camera 24 and the second camera 25 may be installed in the operating room 15 in any manner, such as being fixed, movable, or held by the subject's hand.
  • Signals including electronic medical record information, signals including CT (Computed Tomography) images, signals including MRI (Magnetic Resonance Imaging) images, signals including echo (ultrasound) images, etc. may be input to the operating room terminal 13, and these signals may be displayed in place of the signal from the first camera 24 or the signal from the second camera 25.
  • the microphone 29 is provided in the operating room 15, and converts the sound in the operating room 15 into a signal and outputs it.
  • the signal output from the microphone 29 is transmitted to the operating room terminal 13.
  • the microphone 29 may be provided in either the first camera 24 or the second camera 25, or may be provided independently of the first camera 24 and the second camera 25.
  • the speaker 28 is connected to the operating room terminal 13. When a signal including audio is input from the instruction room terminal 14 to the operating room terminal 13 via the network 12, the speaker 28 can output the audio to the operating room 15.
  • the speaker 28 may be provided either on the monitor 27 or in the main body of the operating room terminal 13, or it may be provided independently of the monitor 27 and the main body of the operating room terminal 13.
  • the teaching room 16 is provided with a teaching room terminal 14, an operation unit 35, a monitor 36, a camera 37, a speaker 38, and a microphone 44.
  • the teaching room terminal 14 is a computer used by a teacher who teaches the subject. The teacher includes a surgeon, a university professor, an authority in the medical field, a scholar, etc.
  • The teaching room terminal 14 is connected to the server 11 via the network 12, and is also connected to the operating room terminal 13 via the network 12. Two-way communication is possible between the teaching room terminal 14 and the server 11, between the teaching room terminal 14 and the operating room terminal 13, and between the server 11 and the operating room terminal 13.
  • the teaching room terminal 14 is connected to the operation unit 35, the monitor 36, the camera 37, the speaker 38, and the microphone 44.
  • the teaching room terminal 14 may be a fixed terminal, a mobile terminal, or a portable terminal, or may be a unit that combines multiple of these terminals.
  • the teaching room terminal 14 has a control circuit 39, a memory unit 40, and a communication unit 41.
  • the control circuit 39 is mainly composed of a video card (also called a graphics board), and has an encoder and decoder that convert signals into data, an arithmetic circuit that performs various processing, comparisons, and calculations, a memory that temporarily stores information, etc.
  • the memory unit 40 may be either an internal type provided inside the main body constituting the teaching room terminal 14, or an external type provided outside the main body. Furthermore, an external memory unit 40 may be a hard disk, a DVD (Digital Versatile Disc), a stick memory, etc.
  • the memory unit 40 stores non-temporary programs, data, information, etc. used for processing and judgments performed by the control circuit 39.
  • the instruction room terminal 14 is connected to the operation unit 35, monitor 36, camera 37, microphone 44, and speaker 38 so as to be able to send and receive signals.
  • the communication unit 41 receives signals transmitted from the server 11 via the network 12, and also receives signals transmitted from the operating room terminal 13 via the network 12.
  • the signals received by the communication unit 41 are input to the control circuit 39.
  • the signals output from the communication unit 41 are transmitted to the server 11 via the network 12.
  • the signals input to and output from the communication unit 41 include information such as video, images, audio, and data.
  • the monitor 36 constructs a display screen 42 using a touch panel.
  • the touch panel may be of any of the following types: capacitive, resistive, and infrared.
  • the monitor 36 and the instruction room terminal 14 are connected by a communication line, for example, HDMI. Information signals including video, images, and audio are exchanged between the monitor 36 and the instruction room terminal 14.
  • the display screen 42 displays a display section 42A that displays the video of the surgical field camera 23 transmitted via the network 12, a display section 42B that displays the video captured by the first camera 24 transmitted via the network 12, a display section 42C that displays the video captured by the second camera 25 transmitted via the network 12, a display section 42D that displays the video captured by the camera 37, a dedicated menu 42E for adding annotations and pointers, and other operation menus.
  • the display screen 42 is compatible with high-resolution images.
  • the monitor 36 can also display evaluations of the instruction and annotations transmitted from the server 11, preferred modes, and the like.
  • the operation unit 35 includes a keyboard, a mouse, a touch panel display screen, a touch pen, etc.
  • the operation unit 35 is connected to the teaching room terminal 14.
  • a signal corresponding to the operation is input to the teaching room terminal 14.
  • the display screen of the operation unit 35 and the display screen of the monitor 36 may be provided separately or may be integrated.
  • FIG. 2B shows an example in which the display screen of the operation unit 35 and the display screen 42 of the monitor 36 are integrated.
  • the instructor can operate the operation menu and the dedicated menu 42E of the display screen 42 using a touch pen or a mouse, etc.
  • the instructor can also add annotations by drawing to the display unit 42A. The technical meaning of annotations will be described later.
  • Camera 37 converts the video captured inside teaching room 16 into a signal and outputs it.
  • Camera 37 is connected to teaching room terminal 14, for example, via HDMI.
  • Camera 37 may be installed in the teaching room in any manner, such as fixed, movable, or held by a staff member.
  • the microphone 44 is provided in the teaching room 16, and converts the sound in the teaching room 16 into a signal and outputs it.
  • the signal output from the microphone 44 is transmitted to the teaching room terminal 14.
  • the microphone 44 may be provided either in the camera 37 or independently of the camera 37.
  • the speaker 38 is connected to the teaching room terminal 14. When a signal including audio is output from the operating room terminal 13 and input to the teaching room terminal 14 via the server 11, the speaker 38 can output the audio from the operating room 15 in the teaching room 16.
  • the speaker 38 may be provided either on the monitor 36 or on the main body of the teaching room terminal 14, or it may be provided independently of the monitor 36 and the main body of the teaching room terminal 14. The audio from the speaker 38 can be switched by operating the operation unit 35.
  • the participation room terminal 43 is used in the participation room 116.
  • The participation room terminal 43 is a computer used by participants, such as trainees and young surgeons studying medical procedures in general. The medical procedures studied may or may not involve surgery.
  • the participation room terminal 43 is connected to the operating room terminal 13 and the teaching room terminal 14 via the server 11.
  • the participation room terminal 43 can transmit a signal containing information to the operating room terminal 13 and can receive a signal containing information from the operating room terminal 13.
  • the participation room terminal 43 can transmit a signal containing information to the teaching room terminal 14 and can receive a signal containing information from the teaching room terminal 14.
  • the participation room terminal 43 may be a fixed terminal, a mobile terminal, or a portable terminal, or may be a unit in which a plurality of these terminals are combined.
  • the participation room terminal 43 has a control circuit 139, a memory unit 140, and a communication unit 141.
  • the control circuit 139 has an arithmetic circuit that performs various processes, comparisons, and calculations, a memory that temporarily stores information, etc.
  • the memory unit 140 stores non-temporary programs, data, information, etc. used for the processes and judgments performed by the control circuit 139.
  • the participation room terminal 43 is connected to the operation unit 135, monitor 136, camera 137, microphone 144, and speaker 138 so as to be able to send and receive signals.
  • the communication unit 141 receives signals transmitted from the server 11 via the network 12. The signals received by the communication unit 141 are input to the control circuit 139. Furthermore, the signals output from the communication unit 141 are transmitted to the server 11 via the network 12.
  • the signals input to and output from the communication unit 141 include various types of information, such as video, images, audio, data, folders, files, etc.
  • Monitor 136 creates a display screen using a touch panel.
  • the touch panel may be of any of the following types: capacitive, resistive, or infrared.
  • Monitor 136 and participation room terminal 43 are connected by a communication line, for example, HDMI. Signals containing various types of information are exchanged between monitor 136 and participation room terminal 43.
  • the operation unit 135 includes a keyboard, a mouse, a touch panel display screen, a touch pen, etc.
  • the operation unit 135 is connected to the participation room terminal 43.
  • When a participant operates the operation unit 135, a signal corresponding to the operation is input to the participation room terminal 43.
  • the display screen of the operation unit 135 and the display screen of the monitor 136 may be provided separately or may be integrated.
  • the participation room terminal 43 acquires information including evaluation criteria necessary for evaluation from the server 11. By operating the operation unit 135, a participant can transmit an evaluation of the instruction and annotations acquired from the instruction room terminal 14 to the server 11. This evaluation can be transmitted for each surgery, each surgery content, each medical condition, each surgeon, and each participant.
  • the evaluation criteria may be any of a predetermined graded evaluation, an evaluation based on a predetermined fixed phrase, an evaluation independently set by the participant, and the like. Participants can also operate the operation unit 135 to transmit to the server 11 the desired mode for each surgery, each medical condition, each surgical procedure, each surgeon, and each participant.
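  • The evaluation and desired mode sent from the participation room terminal 43 to the server 11 might be packaged as in the sketch below; the field names and JSON form are assumptions that merely mirror the evaluation criteria listed above:

        # Hypothetical evaluation payload sent from the participation room terminal to the server.
        import json
        from typing import Optional

        def build_evaluation(surgery_id: str, participant_id: str,
                             grade: Optional[int] = None,
                             fixed_phrase: Optional[str] = None,
                             free_comment: Optional[str] = None,
                             desired_mode: Optional[str] = None) -> str:
            """Build a JSON evaluation; a grade, a fixed phrase, or a free-form comment may be used."""
            payload = {
                "surgery_id": surgery_id,
                "participant_id": participant_id,
                "grade": grade,                 # predetermined graded evaluation (e.g. 1-5)
                "fixed_phrase": fixed_phrase,   # evaluation based on a predetermined fixed phrase
                "free_comment": free_comment,   # evaluation set independently by the participant
                "desired_mode": desired_mode,   # optional preferred mode for this kind of surgery
            }
            return json.dumps({k: v for k, v in payload.items() if v is not None})

        # Usage example with dummy identifiers.
        print(build_evaluation("s-001", "participant_17", grade=4, desired_mode="demo"))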
  • the camera 137 converts the image captured in the participation room 116 into a signal and outputs it.
  • the camera 137 is connected to the participation room terminal 43, for example, via HDMI.
  • The camera 137 may be installed in the participation room in any manner, such as being fixed, movable, or held by a staff member.
  • the microphone 144 is installed in the participation room 116 and captures audio from the participation room 116.
  • the audio captured by the microphone 144 is processed by the participation room terminal 43.
  • the speaker 138 is connected to the participation room terminal 43.
  • When a signal including audio is output from the participation room terminal 43, the speaker 138 outputs the audio into the participation room 116.
  • the speaker 138 may be installed in either the monitor 136 or the main body of the participation room terminal 43, or may be installed independently of the monitor 136 and the main body of the participation room terminal 43.
  • the participation room terminal 43 also transmits a signal including information about the participation room terminal 43 to the server 11.
  • the information about the participation room terminal 43 includes an identification number unique to the participation room terminal 43, the participant's password, the participant's authentication ID, etc.
  • the server 11 processes the identification number unique to the participation room terminal 43, the participant's password, the participant's authentication ID, etc. Once the server 11 has processed the identification number unique to the participation room terminal 43, the participant's password, the participant's authentication ID, etc., mutual communication of information can take place between the server 11 and the participation room terminal 43.
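  • One possible shape of the check the server 11 performs on the identification number, password, and authentication ID before mutual communication begins is sketched below; the storage format and comparison are assumptions for illustration only:

        # Hypothetical admission check for a participation room terminal.
        import hashlib
        import hmac

        # Registered terminals/participants as the server might hold them (values are dummies).
        REGISTERED = {
            "terminal-43": {"auth_id": "participant_17",
                            "password_hash": hashlib.sha256(b"example-password").hexdigest()},
        }

        def admit(terminal_id: str, auth_id: str, password: str) -> bool:
            """Return True if the terminal's identification number, auth ID, and password match."""
            entry = REGISTERED.get(terminal_id)
            if entry is None or entry["auth_id"] != auth_id:
                return False
            candidate = hashlib.sha256(password.encode()).hexdigest()
            return hmac.compare_digest(candidate, entry["password_hash"])

        # Mutual communication with the server would only start after admit(...) returns True.
        assert admit("terminal-43", "participant_17", "example-password")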
  • the processing terminal 81 is an example of a computer having a function of processing medical fees for an instructor when the instructor's action is recognized as a medical action.
  • the processing terminal 81 has a control circuit 87, an operation unit 88, a display unit 89, a memory unit 90, and a communication unit 91.
  • the control circuit 87 has a function of processing the operation content of the operation unit 88 and a function of controlling the information displayed on the display unit 89.
  • the operation unit 88, the display unit 89, the memory unit 90, and the communication unit 91 are connected to the control circuit 87.
  • the display unit 89 is, for example, a monitor.
  • the operation unit 88 includes a keyboard, a mouse, a touch panel, etc.
  • the memory unit 90 stores software, programs, data, etc. used for the processing and judgment performed by the control circuit 87.
  • the communication unit 91 is connected to the server 11 via the network 12.
  • the processing terminal 81 can process medical fees for the instructor based on the instructor's act of remotely instructing the operating room terminal 13 using the instruction room terminal 14.
  • the processing terminal 81 can obtain auxiliary information generated by the auxiliary information processing unit 86 of the server 11 from the server 11.
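  • Because the auxiliary information changes the medical fee depending on the activated mode, the processing terminal 81 could apply a mode-dependent fee table such as the one sketched below; the fee amounts, table layout, and key names are invented placeholders, not actual medical fees:

        # Hypothetical mode-dependent fee calculation on the processing terminal.
        # The fee amounts are arbitrary placeholders, not actual medical fees.
        FEE_TABLE = {
            "proctor": 300,   # e.g. fee units when the proctor mode was activated
            "demo": 200,
            "full": 400,
        }

        def compute_fee(auxiliary_info: dict) -> int:
            """Compute a fee from auxiliary information generated by the server (assumed keys)."""
            mode = auxiliary_info["mode"]
            minutes = auxiliary_info.get("instruction_minutes", 0)
            return FEE_TABLE[mode] + minutes   # base fee per mode plus a per-minute component

        print(compute_fee({"mode": "proctor", "instruction_minutes": 45}))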
  • the processing terminal 81 may also have a function of creating a database of the attendance history of seminars attended by young surgeons to obtain certification.
  • The display screen 34 of the monitor 27 can be divided into a plurality of display sections 34A, 34B, 34C, and 34D.
  • The images captured by the first camera 24 and displayed on the display section 34A include both 2D images and 3D images.
  • the images displayed on display units 34A, 34B, 34C, and 34D include the positions of those involved in the surgery, their postures, the hand movements of those involved in the surgery, the placement positions of the instruments, the orientation of the instruments, the operating direction of the instruments relative to the patient's affected area, and the operating positions of the instruments relative to the patient's affected area.
  • Those involved in the surgery include the surgeon, nurses, assistants, etc.
  • the instruments include an operating table, imaging devices, diagnostic imaging devices, measuring devices, endoscopes, forceps, tweezers, scissors, scalpels, needle holders, etc.
  • the size and display position of the display units 34A, 34B, 34C, and 34D can be set as desired by operating the operation menu.
  • the images captured by the surgical field camera 23, the first camera 24, and the second camera 25, and the audio from the microphone 29 can be transmitted to the instruction room terminal 14 via the network 12.
  • the teaching room terminal 14 receives information including video, images, audio, and data.
  • Display sections 42A, 42B, 42C, and 42D can be displayed on the display screen 42 of the monitor 36.
  • The image on display section 42A is the same as the image on display section 34A, the image on display section 42B is the same as the image on display section 34B, the image on display section 42C is the same as the image on display section 34C, and the image on display section 42D is the same as the image on display section 34D.
  • the images displayed on the display screen 42 are 2D and 3D images.
  • the presence or absence, size and display position of the display units 42A, 42B, 42C and 42D can be set by operating the operation menu of the display screen 42 or by operating the operation unit 35.
  • the speaker 38 in the instruction room 16 can output the sound transmitted from the operating room 15.
  • the instructor can add and display annotations and pointers by superimposing them on the image of the display unit 42A displayed on the display screen 42 by touching the tip of a stylus to the display screen 42.
  • Annotations are explanations, methods, procedures, positions, etc. related to the surgery, and can be mainly represented by a line drawing B1.
  • a pointer without a drawing B1 may also be displayed.
  • the dedicated menu 42E is operated when displaying the annotation of drawing B1 and the pointer on the image displayed on the display screen.
  • The dedicated menu 42E includes a timer (delay display), rectangle, circle, snapshot, recording, chat, whiteboard, hand raising, line drawing priority control, line color, line thickness, type of fixed pattern represented by a line, partial erasure, full erasure, etc.
  • the instructor can add drawing B1 to the image on the display unit 42A by adding a curved or straight line drawing B1, adding an arrow drawing B1, surrounding a specific location with a circular drawing B1 made of lines, etc.
  • the instructor can also give audio guidance. Audio guidance is collected by the microphone 44.
  • the drawing B1 on the display unit 42A includes the incision site in the patient's affected area, the incision method, the incision direction, the position where the tissue is pulled with the forceps, the pulling direction, how to create a dissection surface by pulling, various treatments, etc.
  • Audio instructions can be given in real time as a verbal explanation corresponding to the drawing B1, an explanation of technical meanings and reasons, comprehensive knowledge and ideas including the patient's QOL (quality of life), and an explanation of points that are important for the surgeon when performing the surgery.
  • the instruction room terminal 14 transmits the drawing B1 provided on the display screen 42, the audio picked up by the microphone 44, and the video captured by the camera 37 to the operating room terminal 13 via the network 12.
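  • The signal carrying drawing B1 from the instruction room terminal 14 to the operating room terminal 13 might bundle the stroke data with minimal metadata roughly as in the sketch below (the serialization format is an assumption):

        # Hypothetical serialization of an annotation (drawing B1) sent to the operating room terminal.
        import json
        import time

        def build_annotation_message(strokes, target_display: str = "34A") -> str:
            """Serialize a list of strokes; each stroke is a list of (x, y) points on display 42A."""
            return json.dumps({
                "type": "annotation",
                "timestamp": time.time(),
                "target_display": target_display,    # display section on which drawing B2 is superimposed
                "strokes": [{"points": pts, "color": "yellow", "width": 3} for pts in strokes],
            })

        # Example: a short curved line and a straight stroke (coordinates are dummies).
        message = build_annotation_message([[(120, 80), (140, 95), (165, 100)],
                                            [(200, 150), (230, 150)]])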
  • the operating room terminal 13 displays the annotations and pointers of the drawing B1 transmitted from the instruction room terminal 14 as drawing B2, superimposed on the video on the display unit 34A, for example as shown in FIG. 2A.
  • the operating room terminal 13 also outputs the audio instructions transmitted from the instruction room terminal 14 from the speaker 28. Therefore, while surgery is being performed, the subject in the operating room 15 can receive surgery-related instructions from the instructor in real time.
  • The interface and compression transmission technology in the remote instruction system 10 disclosed herein make it possible to realize two-way communication. Therefore, appropriate instruction and explanations regarding the treatment to be performed in the operating room 15 can be transmitted in real time from the instruction room 16. This allows the treatment in the operating room 15 to proceed smoothly. Specifically, it makes it possible to apply the knowledge and experience of an experienced surgeon to actual cases by providing instruction to a young surgeon.
  • an instructor living in a rural area can instruct the subject without visiting the location of the operating room 15 (e.g., an urban area, etc.).
  • Even if the person receiving instruction in the operating room 15 lives in a medically underserved area such as a remote region or an island and is a young surgeon responsible for medical care in that area, he or she can still receive instruction from the instructor in the instruction room 16.
  • The surgeon can add annotations by drawing on the monitor 27.
  • the operation by the surgeon to add annotations on the monitor 27 is the same as the operation by the instructor to add annotations on the monitor 36.
  • a signal including the annotation added on the monitor 27 is transmitted from the operating room terminal 13 to the server 11.
  • the server 11 can transmit the annotation added on the monitor 27 to the instructor room terminal 14 and the participant room terminal 43.
  • A participant can add annotations by drawing on the monitor 136.
  • the operation by which a participant adds annotations on the monitor 136 is the same as the operation by which an instructor adds annotations on the monitor 36.
  • a signal including the annotation added on the monitor 136 is transmitted from the participant room terminal 43 to the server 11.
  • The server 11 can transmit the annotation added on the monitor 136 to the instruction room terminal 14 and the operating room terminal 13.
  • the second specific example is an example in which surgery is performed on a patient in an operating room, and surgery guidance is given in real time from a location away from the operating room. More specifically, this is an example in which surgery is performed on the patient using a medical robot system while the patient is lying on the operating table.
  • the operating room 15 is provided with a medical robot system 45, a first camera 24, a second camera 25, a microphone 29, an operating room terminal 13, a monitor 27, a speaker 28, and an operation unit 30.
  • the medical robot system 45 has an operating unit (surgeon console) 46, a vision cart 47, a robot body (patient cart) 48, and the like.
  • the operating unit 46 has a master control 49, a monitor 50, a microphone 51, a speaker 52, a chair, operation buttons, and the like.
  • the operating arm of the robot body 48 operates in accordance with the movement of the hand of the surgeon operating the master control 49.
  • the medical robot system 45 is connected to the operating room terminal 13 so as to be able to communicate bidirectionally.
  • the control unit 46 is operated by the surgeon sitting in a chair.
  • the monitor 50 is capable of displaying 3D images, 2D images, cross-sectional images, etc.
  • the cross-sectional images include CT (Computed Tomography) images and MRI (Magnetic Resonance Imaging) images obtained before surgery.
  • audio data from the microphone 51 is transmitted to the operating room terminal 13.
  • the speaker 52 outputs the audio transmitted from the operating room terminal 13.
  • the robot body 48 has multiple operating arms and a surgical field camera 53. Surgical instruments are attached to each of the multiple operating arms.
  • an operation signal is output from the control unit 46 and the robot body 48 functions.
  • An image of the patient's affected area captured by the surgical field camera 53 is transmitted to the operating room terminal 13.
  • an audio signal collected by the microphone 51 is transmitted to the operating room terminal 13.
  • the audio transmitted from the operating room terminal 13 is output from the speaker 38 via the network 12.
  • the vision cart 47 has a monitor 55 and a computer 56.
  • the computer 56 is connected to the operating room terminal 13, the control unit 46, and the robot body 48 so as to enable two-way communication, and is also connected to the monitor 55.
  • the computer 56 processes signals transmitted from the operating room terminal 13, processes signals transmitted from the control unit 46, and processes signals transmitted from the robot body 48.
  • the signals transmitted from the operating room terminal 13 to the computer 56 include information such as video and audio instructions, and annotations of the drawing B1.
  • the computer 56 also outputs signals for the image to be displayed on the monitor 50 of the control unit 46, audio signals to be output from the speaker 52 of the control unit 46, and images to be displayed on the monitor 55 of the vision cart 47.
  • the monitor 50 can display images from the surgical field camera 53, the first camera 24, and the second camera 25.
  • the monitor 50 can also display images from the camera 37 transmitted from the server 11.
  • the monitor 50 can switch between displaying 3D images and 2D images. Therefore, when the surgeon wants to view a 3D image, he does not need to move his line of sight.
  • the monitor 27 does not have to be provided. Furthermore, the other configurations shown in FIG. 3 are the same as the other configurations shown in FIG. 1. Furthermore, the computer 56 processes signals including the operation details of the control unit 46, and sends control signals to the robot body 48. Furthermore, the computer 56 processes signals including the operation details of the robot body 48 and signals including the operation details of the control unit 46, and transmits the processing results to the server 11 via the operating room terminal 13.
  • a signal including the annotation of drawing B1 and a signal including the audio instruction are output from the instruction room terminal 14, and these signals are transmitted to the medical robot system 45 via the network 12 and the operating room terminal 13. Therefore, when surgery is performed using the medical robot system 45, the drawing B1 sent from the instruction room terminal 14 can be displayed as drawing B2 on the display unit 34A of the monitor 50, as shown in FIG. 2A.
  • FIG. 2A shows an example in which the display units 34A, 34B, 34C, and 34D are displayed on the display screen 34 of the monitor 50.
  • the audio transmitted from the instruction room terminal 14 is output from the speaker 52 of the control unit 46. Therefore, in the second specific example, the same effect as in the first specific example can be obtained.
  • the surgeon can add annotations by drawing on the monitors 50 and 55.
  • the operation of the surgeon adding annotations on the monitors 50 and 55 is the same as the operation of the instructor adding annotations on the monitor 36.
  • a signal including the annotations added on the monitors 50 and 55 is transmitted to the server 11 via the vision cart 47 and the operating room terminal 13.
  • the server 11 can transmit the annotations added on the monitors 50 and 55 to the instructor room terminal 14 and the participant room terminal 43.
  • the monitors 50 and 55 can display a display screen 34 that is the same as the display screen 34 of the monitor 27.
  • the display screen 34 of the monitors 50 and 55 can be switched between 2D and 3D images.
  • the display screen 34 of the monitors 50 and 55 can also be set to display only the annotations.
  • the process of displaying the drawing B1 and the pointer added to the 2D image or 3D image on the display unit 42A of the monitor 36 in the instruction room 16 as the drawing B2 and the pointer on the 3D image displayed on the monitors 27 and 55 in the operating room 15 (for example, coordinate conversion) may be performed by either the control circuit 39 of the instruction room terminal 14 or the control circuit 31 of the operating room terminal 13.
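  • as an illustration only, the following is a minimal sketch of such a coordinate conversion, assuming that annotation points are exchanged in resolution-independent normalized coordinates; the function names and example resolutions are illustrative assumptions, not details taken from this disclosure:

        # Minimal sketch of annotation coordinate conversion between displays of
        # different resolutions. Annotation points are assumed to be exchanged as
        # normalized (0.0-1.0) coordinates; all names are illustrative.

        def to_normalized(x_px, y_px, width_px, height_px):
            """Convert pixel coordinates on the sending display (e.g. display unit
            42A of the monitor 36) to resolution-independent coordinates."""
            return x_px / width_px, y_px / height_px

        def to_pixels(x_norm, y_norm, width_px, height_px):
            """Convert normalized coordinates back to pixels on the receiving
            display (e.g. the monitor 27 or the monitor 55 in the operating room)."""
            return round(x_norm * width_px), round(y_norm * height_px)

        # Example: a point of drawing B1 drawn at (640, 360) on a 1920x1080 display
        # unit is reproduced as drawing B2 on a 3840x2160 operating room monitor.
        x_n, y_n = to_normalized(640, 360, 1920, 1080)
        print(to_pixels(x_n, y_n, 3840, 2160))   # -> (1280, 720)
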
  • the size and position of each of the display units 42A, 42B, 42C, and 42D displayed on the display screen 42 of the monitor 36 can be changed arbitrarily by operating the operation unit 35 or the operation menu. For example, when the video of the operating room 15 is divided chronologically into a plurality of parts at predetermined time intervals, the sizes of the display units 42A and 42B can be made different for each division.
  • the environment may be such that a single teaching room 16 is provided with a plurality of teaching room terminals 14, or such that a plurality of teaching rooms 16 are each provided with a teaching room terminal 14.
  • the plurality of teaching rooms 16 may be located in different locations.
  • the processing that can be performed on a specific teaching room terminal 14 may be different from the processing that can be performed on the other teaching room terminals 14. For example, processing such as playing, stopping, pausing, fast forwarding, and fast rewinding recorded video that is performed in the operating room 15 may be performed only on a specific teaching room terminal 14.
  • the multiple monitors 36 may be distinguished from each other by using an identification symbol or color coding for each different monitor 36.
  • a display process such as surrounding the displayed image of the corresponding monitor 36 with a colored frame may be performed to identify one of the multiple teaching room terminals 14.
  • the process of changing the color of the drawings displayed on the monitors 27, 50, and 55 of the operating room 15 for each of the multiple monitors 36 can be performed by either the control circuit 17 of the server 11 or the control circuit 31 of the operating room terminal 13. Also, when multiple monitors 36 are provided, the control circuit 17 of the server 11 can also perform a process of restricting the selection of the same drawing color on each monitor 36. Furthermore, the control circuit 31 of the operating room terminal 13 can also perform a process of displaying on the monitors 27, 50, and 55 of the operating room terminal 13 which of the multiple monitors 36 the drawing displayed on the monitors 27, 50, and 55 was created on.
  • the control circuit 17 of the server 11 can also perform processing to distribute, in real time or in recorded form via the network 12, the images displayed on the monitors 27, 50, 55 in the operating room 15 and the audio output from the speakers 28, 52 in the operating room 15 to a participating room terminal 43 different from the operating room terminal 13 and the instructor room terminal 14.
  • the participating room terminal 43 can receive 2D images, 3D images, and audio from the server, and can play back the images and audio.
  • the participating room terminal 43 does not transmit images or audio to the server 11.
  • the participating room terminal 43 includes fixed terminals, mobile terminals, portable terminals, goggle-type terminals, projectors, etc.
  • a display section T1 showing the time when the annotation signal of drawing B1 was sent from the teaching room terminal 14, a display section T2 showing the time when the signal returned to the teaching room terminal 14 via the server 11 after being sent to the operating room terminal 13, and a display section T3 showing the time required for one-way signal transmission between the teaching room terminal 14 and the operating room terminal 13 may be displayed in the corners of the display screen 42 of the monitor 36.
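  • a minimal sketch of how the values in the display sections T1, T2, and T3 could be derived is given below; estimating the one-way time T3 as half of the measured round trip is an assumption made here for illustration, not a method stated in this disclosure:

        from datetime import datetime, timedelta

        # T1: time the annotation signal of drawing B1 was sent from the teaching room terminal 14
        t1 = datetime(2023, 10, 12, 10, 15, 0)
        # T2: time the signal returned to the teaching room terminal 14 via the server 11 and the
        # operating room terminal 13 (184 ms later in this illustrative example)
        t2 = t1 + timedelta(milliseconds=184)

        round_trip = t2 - t1
        t3_ms = round_trip.total_seconds() * 1000 / 2   # estimated one-way time T3

        print("T1:", t1.time(), " T2:", t2.time(), " T3:", t3_ms, "ms")
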
  • the subject can send recorded video of the surgical procedure from the operating room terminal 13 to the instruction room terminal 14 when the surgical procedure is not being performed.
  • the operating room terminal 13 does not need to be provided in the operating room 15.
  • the recorded video is stored in a folder in the memory unit 32 of the operating room terminal 13.
  • the folders that store the recorded video are stored in the memory unit 32 after being classified according to the part of the body, the case, the surgeon, the surgical procedure, the date and time, the operating room number (operation order), etc.
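  • a minimal sketch of such a folder classification follows; the directory layout, field names, and values are illustrative assumptions only:

        from pathlib import Path
        from datetime import datetime

        def recording_folder(root, body_part, case, surgeon, procedure, started_at, room_no):
            """Build a folder path classified by body part, case, surgeon, surgical
            procedure, date and time, and operating room number (operation order)."""
            return (Path(root) / body_part / case / surgeon / procedure /
                    started_at.strftime("%Y%m%d_%H%M") / f"OR{room_no:02d}")

        print(recording_folder("recorded_video", "colon", "case-0012", "surgeon-A",
                               "laparoscopic-resection", datetime(2023, 10, 12, 9, 0), 3))
        # e.g. recorded_video/colon/case-0012/surgeon-A/laparoscopic-resection/20231012_0900/OR03
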
  • the recorded video can be sent from the operating room terminal 13 to the instruction room terminal 14 via the server 11, and the recorded video can be displayed on the display screen 42 of the monitor 36, allowing the subject to receive instruction from the instructor.
  • the recorded video can be shared between both the operating room terminal 13 and the instruction room terminal 14.
  • the instructor can annotate the drawing B1 on the display screen 42 of the monitor 36 and provide audio instruction in the same way as in a surgery performed in real time.
  • an operation menu (operation buttons) for playing, stopping, pausing, fast forwarding, rewinding, etc. of the recorded video is displayed on the monitor 27 in the operating room 15.
  • it may be configured so that these operations can be performed on at least one of the monitors 27, 50, 55 and the monitor 36.
  • the operating room terminal 13 may have the authority to grant the instructor room terminal 14 permission to perform the above operations.
  • FIG. 4 is a flowchart comprehensively illustrating the control examples described in the first specific example, the second specific example, and the application example.
  • Fig. 4 illustrates steps including processes and judgments performed by the server 11, the operating room terminal 13, the teaching room terminal 14, and the participating room terminal 43.
  • the server 11, the operating room terminal 13, the teaching room terminal 14, and the participating room terminal 43 can also execute each step in an order different from the order of each step shown in Fig. 4.
  • in step S10, the operating room terminal 13 displays video of the procedure related to the surgery on at least one of the monitors 27, 50, and 55.
  • the operating room terminal 13 also transmits a signal including information about the operating room terminal 13 to the server 11.
  • the information about the operating room terminal 13 includes an identification number unique to the operating room terminal 13, the surgeon's password, the surgeon's authentication ID, etc.
  • the training room terminal 14 displays the image captured by the camera 37 on the monitor 36. Also, in step S11, the training room terminal 14 transmits a signal including information about the training room terminal 14 to the server 11.
  • the information about the training room terminal 14 includes an identification number unique to the training room terminal 14, the instructor's password, the instructor's authentication ID, etc.
  • the server 11 receives the signal transmitted from the operating room terminal 13 and the signal transmitted from the training room terminal 14 in step S12, and authenticates the operating room terminal 13 and the training room terminal 14.
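  • a minimal sketch of the authentication in step S12, using the identification number, password, and authentication ID sent in steps S10 and S11, is shown below; the registry contents and the plain-text comparison are illustrative simplifications, not details of this disclosure:

        # Minimal sketch of the server-side authentication in step S12.
        # The registered values are illustrative only.
        REGISTERED_TERMINALS = {
            "OR-13":  {"password": "op-pass",   "auth_id": "surgeon-001"},
            "INS-14": {"password": "inst-pass", "auth_id": "instructor-001"},
        }

        def authenticate(terminal_id, password, auth_id):
            """Return True if the terminal's identification number, password,
            and authentication ID match the registered entry."""
            entry = REGISTERED_TERMINALS.get(terminal_id)
            return (entry is not None
                    and entry["password"] == password
                    and entry["auth_id"] == auth_id)

        print(authenticate("OR-13", "op-pass", "surgeon-001"))     # True
        print(authenticate("INS-14", "wrong", "instructor-001"))   # False
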
  • the server 11 transmits the signal received from the operating room terminal 13 to the training room terminal 14, and transmits the signal received from the training room terminal 14 to the operating room terminal 13 in step S13.
  • the operating room terminal 13 acquires the information transmitted from the server 11 in step S14.
  • the teaching room terminal 14 acquires the information transmitted from the server 11 in step S15.
  • in step S18, the operating room terminal 13 displays the image sent from the teaching room terminal 14 on at least one of the monitors 27, 50, 55, and transmits information including the image of the operating room 15 and information including the audio to the server 11.
  • in step S19, the teaching room terminal 14 displays the image sent from the operating room terminal 13 on the monitor 36, and transmits information including the image of the teaching room 16 and information including the audio to the server 11.
  • in step S20, the server 11 receives a signal sent from the operating room terminal 13 and stores it in the memory unit 22, and receives a signal sent from the teaching room terminal 14 and stores it in the memory unit 22.
  • in step S21, the server 11 transmits a signal including the image and audio to the operating room terminal 13, and transmits a signal including the image and audio to the teaching room terminal 14.
  • the operating room terminal 13 receives the video sent from the server 11 and displays it on at least one of the monitors 27, 50, 55 in step S22.
  • the training room terminal 14 receives the video sent from the server 11 and displays it on the monitor 36 in step S23.
  • the instructor adds annotations to the video displayed on the monitor 36 in step S24, for example, the drawing B1 in FIG. 2(B).
  • the instructor also gives audio instruction from the microphone 44 in step S24.
  • the training room terminal 14 then processes the annotation signal of the added drawing B1 in step S24 and transmits a signal corresponding to the added annotation of the drawing B1 and a signal corresponding to the audio instruction to the server 11 in step S25.
  • the server 11 receives a signal from the instruction room terminal 14 in step S26, and transmits the signal to the operating room terminal 13 in step S27.
  • the operating room terminal 13 receives the signal from the server 11 in step S28.
  • the operating room terminal 13 then processes the received signal in step S29, and displays the annotation of the drawing B1 on at least one of the monitors 27, 50, 55 as shown in FIG. 2A. Additionally, audio instruction is output from at least one of the speakers 28, 52. Note that the operating room terminal 13 can add annotations in step S50 without acquiring information from the instruction room terminal 14.
  • in step S20, the server 11 receives a signal including video and audio from the operating room 15, and also receives a signal including video and audio from the instruction room 16. Therefore, if the participating room terminal 43 is directly connected to the server 11 via the network 12 at the time of step S20, real-time distribution to the participating room terminal 43 can be performed.
  • the server 11 can also store the information received in step S20 in the storage unit 22. Therefore, if the participating room terminal 43 is connected to the server 11 after the server 11 performs the process of step S20, the user of the participating room terminal 43 can use the information stored in the storage unit 22 of the server 11 on a video-on-demand basis. Note that in a situation where surgery is not being performed in real time, the operating room terminal 13 may be connected to the server 11, and the information stored in the storage unit 22 may be used on a video-on-demand basis.
  • when the video of the operating room 15 is recorded in the memory unit 22 of the server 11, or when the video of the operating room 15 is distributed live to all participating room terminals 43 at the same time, the second control can be selected. Furthermore, although not shown in FIG. 4, it is also possible to select the first control and perform the processing of steps S18 to S28 between the operating room terminal 13 and the training room terminal 14 by bypassing the server 11. In this case, steps S20, S21, S26, and S27 in FIG. 4 are skipped. The programs for switching between the first control and the second control are stored in the memory unit 32 of the operating room terminal 13, the memory unit 22 of the control circuit 17 of the server 11, and the memory unit 40 of the training room terminal 14, respectively.
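  • a minimal sketch of switching between the first control (signals exchanged directly between the terminals, bypassing the server 11) and the second control (signals routed through the server 11 so that they can be stored and distributed) is given below; all class and function names are illustrative assumptions:

        class Terminal:
            def __init__(self, name):
                self.name = name
            def receive(self, signal):
                print(self.name, "received", signal)

        class Server:
            def __init__(self):
                self.memory = []                     # stands in for the memory unit 22
            def relay(self, signal, receiver):
                self.memory.append(signal)           # enables recording, VOD, and live distribution
                receiver.receive(signal)

        def send(signal, receiver, server=None, control="second"):
            if control == "first" or server is None:
                receiver.receive(signal)             # direct path: steps S20, S21, S26, S27 skipped
            else:
                server.relay(signal, receiver)       # path through the server 11

        operating = Terminal("operating room terminal 13")
        teaching = Terminal("training room terminal 14")
        send({"annotation": "drawing B1"}, operating, control="first")                    # first control
        send({"video": "operative field"}, teaching, server=Server(), control="second")  # second control
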
  • the server 11 can transmit various information to the participation room terminal 43 in step S27 of FIG. 4.
  • the various information sent by the server 11 to the participation room terminal 43 includes information acquired from the operating room terminal 13 and information acquired from the instruction room terminal 14.
  • the participation room terminal 43 can receive various information from the server 11 in step S30.
  • the participation room terminal 43 can display information received from the server 11, such as video and images, on the monitor 136 in step S30.
  • the participation room terminal 43 can output information received from the server 11, such as audio, from the speaker 138 in step S30.
  • in step S31, the participating room terminal 43 can transmit various types of information to the server 11.
  • in step S26, the server 11 can receive various types of information from the participating room terminal 43 and store the processing results of the various types of information in the storage unit 22.
  • in step S13, the server 11 can transmit various types of information acquired from the participating room terminal 43 to the operating room terminal 13 and the instruction room terminal 14.
  • the server 11 stores various information acquired from the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43, and information sent to the operating room terminal 13, the instruction room terminal 14, and the participating room terminal 43 in a database in the memory unit 22, and performs a process of creating a history by updating the database.
  • necessary information including the date and time of the surgery, the name of the person in charge of the surgery, the name of the surgeon, the names of assistants in the operating room 15, the name of the instructor, the names of the participants, the name of the patient's disease and case, the surgical site on the patient, the surgical procedure, and the equipment used is compiled into a database to generate a briefing sheet, which is then stored in the memory unit 22.
  • the necessary information stored in the memory unit 22 is distinguished by an identification symbol assigned to each surgery.
  • these pieces of information are input into the operating room terminal 13 or the teaching room terminal 14, and are transmitted from the operating room terminal 13 or the teaching room terminal 14 to the server 11.
  • various pieces of information generated during surgery, such as videos, images, instructions, annotations, audio, the movements of the robot body 48, and the operation contents of the master control 49, are linked to the above-mentioned identification symbols and stored in the memory unit 22. Therefore, in any of the operating room terminal 13, the teaching room terminal 14, and the participating room terminal 43, it is possible to search for the desired information by inputting the identification symbol or by inputting part of the necessary information.
  • the operation contents of the master control 49 and information including the operation of the robot body 48 are associated with a time code that encodes the time information at which the annotation was generated, and are stored in the memory unit 22.
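  • a minimal sketch of such a history database, in which information generated during surgery is linked to the identification symbol assigned to each surgery and to a time code, is shown below; the record structure, field names, and search helper are illustrative assumptions:

        # Minimal sketch of the history database held in the memory unit 22.
        surgery_db = {
            "SRG-2023-1012-03": {                       # identification symbol of one surgery
                "briefing": {"surgeon": "surgeon-A", "procedure": "laparoscopic resection",
                             "disease": "colon cancer", "operating_room": 3},
                "events": [
                    {"time_code": "00:42:10", "type": "annotation",     "data": "drawing B1"},
                    {"time_code": "00:42:10", "type": "master_control", "data": "grasp"},
                    {"time_code": "00:42:11", "type": "robot_body",     "data": "arm 2 closed"},
                ],
            },
        }

        def search(db, symbol=None, **briefing_terms):
            """Look up records either by identification symbol or by part of the briefing."""
            for key, record in db.items():
                if symbol is not None and key == symbol:
                    yield key, record
                elif briefing_terms and all(record["briefing"].get(k) == v
                                            for k, v in briefing_terms.items()):
                    yield key, record

        print(list(search(surgery_db, surgeon="surgeon-A")))
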
  • the control circuit 17 of the server 11 can classify various pieces of information for different instructors, different surgeons, different illnesses, different surgical procedures, different cases, and different parts of the patient's body, and store the information in a storage area such as the memory unit 22.
  • the control circuit 17 of the server 11 also has a function of generating learning data for instruction based on the information and data stored in the memory unit 22.
  • the created learning data is stored in the memory unit 22.
  • the learning data for instruction is used for instruction on performing surgery using the medical robot system 45.
  • the auxiliary information processing unit 86 of the control circuit 17 has a function of generating auxiliary information related to the medical fee of the instructor based on the information and data stored in the memory unit 22.
  • the auxiliary information for the medical fee includes the date of the remote instruction by the instructor, the elapsed time from the start to the end of the instruction, the content of the remote instruction by the instructor, etc.
  • the auxiliary information also includes points and scores calculated from the elapsed time from the start to the end of the instruction and the content of the remote instruction by the instructor.
  • the instruction includes the name of the patient's disease and case in the actual surgery, the surgical site on the patient, the surgical procedure, etc., as well as lectures to trainees and young surgeons.
  • the auxiliary information generated by the server 11 can be used by third parties. If remote instruction is covered by insurance, the certification information of young doctors generated by the server 11 can be used for the calculation of the medical fee received by the instructor.
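  • a minimal sketch of generating such auxiliary information is shown below; the point calculation rule (one point per started 10 minutes of instruction plus a weight per instruction content) is purely an illustrative assumption and not a rule taken from this disclosure:

        import math
        from datetime import datetime

        CONTENT_WEIGHT = {"surgical guidance": 3, "lecture": 1}   # assumed weights

        def auxiliary_info(start, end, contents):
            """Summarize the date, elapsed time, contents, and points of one remote instruction."""
            elapsed_min = (end - start).total_seconds() / 60
            points = math.ceil(elapsed_min / 10) + sum(CONTENT_WEIGHT.get(c, 0) for c in contents)
            return {
                "date": start.date().isoformat(),
                "elapsed_minutes": round(elapsed_min),
                "contents": contents,
                "points": points,
            }

        print(auxiliary_info(datetime(2023, 10, 12, 9, 0), datetime(2023, 10, 12, 10, 25),
                             ["surgical guidance", "lecture"]))
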
  • each camera may be a 360-degree camera.
  • the monitor may be provided in VR (Virtual Reality) goggles worn on the face, or an HMD (Head Mounted Display) worn on the head.
  • the mobile terminal and the portable terminal include a smartphone and a tablet terminal.
  • the computers disclosed herein include personal computers, workstations, host computers, mainframes, supercomputers, etc.
  • the network connecting the server, the operating room terminal, and the teaching room terminal so as to enable two-way communication may be the Internet, an intranet, or an extranet.
  • the first location is a location where the operating room terminal 13 is provided, and the first location may be, for example, any of an operating room, a lecture room, a hall, a space, a room, etc.
  • the second location is a location where the instruction room terminal 14 is provided, and the second location may be, for example, any of an operating room, a lecture room, a hall, a waiting room, a remote instruction room, a space, a room, etc.
  • the second location is not limited to any one location, and may be multiple locations.
  • the computer possessed by the server may be a single electric component or electronic component, or may be a unit having multiple electric components or multiple electronic components.
  • the electric components or electronic components include a processor, a control circuit, and a module.
  • the computer possessed by the server may be either a single computer or multiple computers.
  • the communication unit and the memory unit may be separate components or separate devices communicably connected to the computer.
  • when the remote instruction system 10 is configured to use neither the environment in which the participation room terminal 43 connects to the server 11 to receive real-time distribution, nor the environment in which the participation room terminal 43 connects to the server 11 to use information stored in the storage unit 22 on a video-on-demand basis, steps S18, S19, S20, S21, S22, and S23 shown in FIG. 4 can be skipped.
  • the switching unit 105 has a proctor mode circuit 120, a demo mode circuit 121, and a full mode circuit 122.
  • the proctor mode circuit 120 activates a proctor mode (proctoring mode)
  • the demo mode circuit 121 activates a demo mode (demonstration mode)
  • the full mode circuit 122 activates the full mode.
  • the activation and deactivation of the proctor mode circuit 120, the demo mode circuit 121, and the full mode circuit 122 are switched by operating the teaching room terminal 14.
  • FIG. 5 shows an example of a screen displayed on the monitor 36 of the teaching room terminal 14.
  • On the monitor 36, a proctor mode icon 201, a demo mode icon 202, a full mode icon 203, a participant screen 204, a main screen 205, and a sub screen 206 are displayed.
  • the proctor mode icon 201, the demo mode icon 202, and the full mode icon 203 form a mode selection section.
  • When the proctor mode icon 201, the demo mode icon 202, or the full mode icon 203 is operated, the auxiliary information processing section 86 generates a mode start request.
  • When the proctor mode icon 201 is operated, the proctor mode circuit 120 is activated. When the demo mode icon 202 is operated, the demo mode circuit 121 is activated. When the full mode icon 203 is operated, the full mode circuit 122 is activated.
  • the icon color of a mode that is activated is displayed darker than the icon color of an inactive mode.
  • the active mode is displayed with a design that makes it clear that it is the currently selected mode.
  • the main screen 205 is a screen that displays video and images acquired from the operating room 15.
  • the main screen 205 mainly displays video and images captured by the surgical field camera 23, or images of the inside of the body after incision with a scalpel.
  • the main screen 205 may also display academic conference materials or explanatory materials stored in the memory unit 32 of the operating room terminal 13.
  • the main screen 205 may also display materials stored on the cloud.
  • the sub-screen 206 is a screen that displays video and images captured by the first camera 24 and the second camera 25.
  • the proctor mode circuit 120 activates the proctor mode.
  • the proctor mode is a mode selected when, for example, an instructor in the instruction room 16, such as an academically certified doctor (expert doctor (surgeon)), provides surgical guidance to the surgeon in the operating room 15 and the participants in the participation room 116.
  • the demo mode circuit 121 activates the demo mode.
  • the demo mode is a mode in which, for example, an instructor, such as an academically certified doctor (expert doctor (surgeon)), provides guidance to the surgeon on how to improve his or her skills.
  • the full mode circuit 122 activates the full mode.
  • the full mode is a mode in which, for example, a conversation can be shared by everyone, including the instructor, surgeon, and participants.
  • When proctor mode is activated, two-way communication of information including audio is possible between the operating room terminal 13 and the instruction room terminal 14. When proctor mode is activated, one-way communication of information including audio is possible from the operating room terminal 13 to the participating room terminal 43. When proctor mode is activated, one-way communication of information including audio is possible from the instruction room terminal 14 to the participating room terminal 43. When proctor mode is activated, information including audio cannot be sent from the participating room terminal 43 to the instruction room terminal 14, and information including audio cannot be sent from the participating room terminal 43 to the operating room terminal 13.
  • information including annotations can be transmitted from the instruction room terminal 14 to the operating room terminal 13, and information including annotations can be transmitted from the instruction room terminal 14 to the participating room terminal 43.
  • information including annotations cannot be transmitted from the participating room terminal 43 to the operating room terminal 13, and information including annotations cannot be transmitted from the participating room terminal 43 to the instruction room terminal 14.
  • When the demo mode is activated, two-way communication of information including audio is possible between the participating room terminal 43 and the teaching room terminal 14. When the demo mode is activated, one-way communication of information including audio is possible from the operating room terminal 13 to the teaching room terminal 14. When the demo mode is activated, information including audio cannot be transmitted from the teaching room terminal 14 to the operating room terminal 13.
  • when the demo mode is activated, information including annotations can be communicated bidirectionally between the instruction room terminal 14 and the participating room terminal 43.
  • when the demo mode is activated, information including annotations cannot be transmitted from the participating room terminal 43 to the operating room terminal 13, and information including annotations cannot be transmitted from the operating room terminal 13 to the participating room terminal 43.
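  • a minimal sketch of the audio transmission states described above for the proctor, demo, and full modes is shown below (annotation paths could be expressed with an analogous table); treating paths not mentioned in the description as not allowed is an assumption made only for this illustration:

        # Minimal sketch of the per-mode audio transmission states. True means the
        # sender-to-receiver path is allowed; all names are illustrative.
        AUDIO_PATHS = {
            "proctor": {
                ("operating", "instruction"): True,  ("instruction", "operating"): True,
                ("operating", "participant"): True,  ("instruction", "participant"): True,
                ("participant", "operating"): False, ("participant", "instruction"): False,
            },
            "demo": {
                ("participant", "instruction"): True, ("instruction", "participant"): True,
                ("operating", "instruction"): True,   ("instruction", "operating"): False,
            },
            "full": "all",   # conversation shared by everyone
        }

        def audio_allowed(mode, sender, receiver):
            rules = AUDIO_PATHS[mode]
            if rules == "all":
                return True
            return rules.get((sender, receiver), False)   # unspecified paths treated as not allowed

        print(audio_allowed("proctor", "participant", "operating"))   # False
        print(audio_allowed("full", "participant", "operating"))      # True
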
  • FIG. 6 is an image diagram of mode switching.
  • When proctor mode is activated, participant P1 participates by looking at monitor 136.
  • When demo mode is selected, participant P1 can receive an explanation about the surgery from instructor P2.
  • When full mode is activated, participant P1 can participate in the surgery while watching the interaction between the surgeon and instructor.
  • Figure 7 is a conceptual diagram of the API (Application Programming Interface) when proctor mode is activated.
  • Figure 8 is a conceptual diagram of the API when demo mode is activated.
  • Figure 9 is a conceptual diagram of the API when full mode is activated.
  • FIG. 10 shows a mode change sequence in one embodiment of the present disclosure.
  • the mode icon on the monitor 36 of the teaching room terminal 14 is operated.
  • the instructor can operate the teaching room terminal 14 to select a mode in step S40.
  • the instructor can select a mode according to the first to third scenes. For example, the instructor can select demo mode to be used for the first scene. The instructor can select full mode to be used for the second scene. The instructor can select proctor mode to be used for the third scene.
  • the selected mode is transmitted from the teaching room terminal 14 to the server 11.
  • the server 11 receives a signal of the selected mode from the teaching room terminal 14 and stores it in the storage unit 22.
  • the switching unit 105 of the server 11 performs a process of activating the selected mode.
  • commands to perform control according to the activated mode are sent from the server 11 to the operating room terminal 13, the teaching room terminal 14, and the participating room terminal 43.
  • the operating room terminal 13 displays the activated mode on the monitor 27.
  • the monitor 55 of the vision cart 47 also displays the activated mode.
  • the monitor 50 of the control unit 46 also displays the activated mode.
  • the teaching room terminal 14 displays the activated mode on the monitor 36.
  • the participating room terminal 43 displays the activated mode on the monitor 136.
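  • a minimal sketch of this mode change sequence, in which the mode selected on the teaching room terminal 14 is sent to the server 11, activated by the switching unit 105, and reported to every terminal for display, is shown below; all class and method names are illustrative assumptions:

        class SwitchingUnit:
            def __init__(self):
                self.active_mode = None
            def activate(self, mode):
                self.active_mode = mode              # proctor / demo / full

        class Server:
            def __init__(self, terminals):
                self.switching_unit = SwitchingUnit()
                self.terminals = terminals
            def on_mode_selected(self, mode):
                self.switching_unit.activate(mode)   # activate the selected mode
                for t in self.terminals:             # command each terminal to display it
                    t.show_active_mode(mode)

        class Terminal:
            def __init__(self, name):
                self.name = name
            def show_active_mode(self, mode):
                print(f"{self.name}: active mode = {mode}")

        server = Server([Terminal("operating room terminal 13"),
                         Terminal("teaching room terminal 14"),
                         Terminal("participating room terminal 43")])
        server.on_mode_selected("proctor")           # mode selected on the teaching room terminal
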
  • the instructor may also select each mode based on different conditions for the first, second, and third scenes.
  • An example is as follows:
  • in step S38, which is performed before the operation of step S40, the server 11 again obtains the mode preference from the operating room terminal 13 and the participating room terminal 43, and obtains an evaluation from the participating room terminal 43.
  • in step S39, which is performed following step S38 and before step S40, the server 11 can recommend to the instruction room terminal 14, as an appropriate mode, the mode in which the surgery proceeded most smoothly among the data stored in the memory unit 22 during the first support process described above, depending on the content of the surgery, the case, the surgeon, etc.
  • the server 11 can recommend to the instruction room terminal 14 the mode that was selected at the time of the operation that received the highest evaluation among the second support processes described above, depending on the operation content, case, surgeon, participants, etc. Furthermore, in step S39, the server 11 can recommend to the instruction room terminal 14 the mode preference received from the operation room terminal 13 or the participant room terminal 43. Therefore, in step S40, the instructor can display the evaluation and the mode preference on the monitor 36 of the instruction room terminal 14. Also, in step S40, the instructor can operate the mode icon and select a mode, referring to the evaluation and the mode preference.
  • the main points are points to note for surgery in general.
  • the particular strength of this remote instruction system 10 is that it can take into account the special circumstances of individual patients, individual cases, and individual surgeons. This includes individual elements that cannot be corrected by passively watching video, such as the habits and ways of thinking of each surgeon.
  • according to the present disclosure, during surgery it is possible to switch between the operating room terminal 13, the instructor room terminal 14, and the participant room terminal 43, with which information including video, audio, and annotations can be exchanged. This makes it possible to select a mode that matches the compliance requirements for live surgery determined by various academic societies.
  • the surgeon using the operating room terminal 13 can reduce the stress caused by the situation in the operating room being viewed from the outside, that is, by the instructor using the instructor room terminal 14 and the participants using the participant room terminal 43.
  • the instructor room terminal 14 can provide appropriate instruction and annotations to the operating room terminal 13 according to the surgery being performed by the surgeon.
  • When the switching unit 105 of the server 11 of the present disclosure receives a mode activation request from the instruction room terminal 14 as to which mode to activate, it activates the requested mode. This allows the surgeon in the operating room to concentrate on the surgery, and the instructor can switch modes depending on the situation of the surgery.
  • the annotation processing unit 84 of the server 11 of the present disclosure associates information including the operation of the robot main body acquired by the robot information processing unit with annotation information generated by the instruction room terminal 14, and stores the information in the memory unit 22. This makes it possible to generate learning data to be used when instructing a young surgeon or an inexperienced surgeon on how to perform surgery using the robot main body 48.
  • the auxiliary information processing unit 86 of the present disclosure can generate auxiliary information used to process the medical fees of the instructor on the processing terminal 81.
  • the auxiliary information processing unit 86 can generate auxiliary information with characteristics that cause the medical fees to differ depending on the type of mode activated by the switching unit 105. Therefore, when the instructor's remote instruction is covered by insurance, the auxiliary information can be utilized, improving convenience for the user of the processing terminal 81.
  • the remote guidance system 10 is an example of an information processing system.
  • the server 11 is an example of an information processing device.
  • the monitors 27, 50, 55 are examples of display units provided in an operating room.
  • the microphones 29, 51 are examples of microphones that capture audio within the operating room.
  • the speakers 28, 52 are examples of speakers that output audio captured from outside the operating room.
  • the operating room terminal 13, the control unit 46, and the computer 56 are examples of operating room terminals.
  • Monitor 36 is an example of a display provided in the teaching room.
  • Microphone 44 is an example of a microphone that acquires audio from within the teaching room.
  • Speaker 38 is an example of a speaker that outputs audio acquired from outside the teaching room.
  • Teaching room terminal 14 is an example of a teaching room terminal.
  • Monitor 136 is an example of a display provided in the participation room.
  • Microphone 144 is an example of a microphone that acquires audio from within the participation room.
  • Speaker 138 is an example of a speaker that outputs audio acquired from outside the participation room.
  • Communication unit 20 is an example of a communication unit of an information processing device.
  • Memory unit 22 is an example of a memory unit.
  • Robot information processing unit 85 is an example of a robot information processing unit.
  • the proctor mode is an example of the first mode.
  • the demo mode is an example of the second mode.
  • the full mode is an example of the third mode.
  • the switching unit 105 can be defined as a switch or a switching circuit.
  • the video processing unit 82 can be defined as a video processor or a video processing circuit.
  • the audio processing unit 83 can be defined as an audio processor or an audio processing circuit.
  • the annotation processing unit 84 can be defined as an annotation processor or an annotation processing circuit.
  • the robot information processing unit 85 can be defined as a robot information processor or a robot information processing circuit.
  • the auxiliary information processing unit 86 can be defined as an auxiliary information processor or an auxiliary information processing circuit.
  • the control unit 46 shown in FIG. 3 may be provided in a location different from the operating room 15 in which the robot body 48 is provided.
  • the control unit 46 is connected to the computer 56 of the vision cart 47 via a network.
  • the network may be either wireless or wired.
  • the control circuit of the information processing device disclosed in this embodiment performs the following processes.
  • the control circuit activates the mode selected on the instruction room terminal, stores the activated mode in the memory unit, stores the number of annotations and the required time for annotations made from the instruction room terminal to the operating room terminal and the participating room terminal for each type of surgery performed in the operating room, each medical condition, each surgeon, and each participant in the memory unit, and determines which of the activated modes allowed the operation in the operating room to proceed most smoothly based on the number of annotations and the required time, and recommends the determined mode to the instruction room terminal.
  • the control circuit performs the process of activating the mode selected on the instruction room terminal, acquiring from the participating room terminal an evaluation of the annotation sent from the instruction room terminal to the participating room terminal and storing the acquired evaluation in a memory unit, and recommending to the instruction room terminal the mode that was selected at the time the surgery that received the highest evaluation was performed.
  • the control circuit performs a process of activating the mode selected on the teaching room terminal, and a process of acquiring the desired mode from at least one of the operating room terminal or the participating room terminal before the mode is selected on the teaching room terminal, and recommending the acquired desired mode to the teaching room terminal.
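  • a minimal sketch of the three recommendation processes just described is shown below; the smoothness criterion used here (fewer annotations and less annotation time means a smoother surgery) and the record fields are illustrative assumptions:

        # Minimal sketch of mode recommendation based on the stored history.
        history = [
            {"mode": "proctor", "annotations": 14, "annotation_minutes": 22, "evaluation": 4.1},
            {"mode": "demo",    "annotations": 6,  "annotation_minutes": 9,  "evaluation": 4.6},
            {"mode": "full",    "annotations": 9,  "annotation_minutes": 15, "evaluation": 3.8},
        ]

        def recommend_smoothest(records):
            """Mode of the surgery judged smoothest from annotation count and required time."""
            return min(records, key=lambda r: (r["annotations"], r["annotation_minutes"]))["mode"]

        def recommend_best_evaluated(records):
            """Mode selected at the time of the surgery with the highest evaluation."""
            return max(records, key=lambda r: r["evaluation"])["mode"]

        def recommend_desired(desired_modes):
            """Relay a desired mode acquired from the operating room or participating room terminal."""
            return desired_modes[0] if desired_modes else None

        print(recommend_smoothest(history))        # demo
        print(recommend_best_evaluated(history))   # demo
        print(recommend_desired(["full"]))         # full
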
  • This disclosure can be used as an information processing device, information processing system, and information processing program that are connected to an operating room terminal, an instruction room terminal, and a participant room terminal via a network.

Landscapes

  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The objective of the present invention is to provide an information processing device that can switch counterpart terminals among an operating room terminal, an instruction room terminal, and a participation room terminal that are capable of mutually communicating information including video, audio, and annotations. To this end, a server (11) comprises a control circuit (17) that is connected to an operating room terminal (13), an instruction room terminal (14), and a participation room terminal (43) via a network (12), and that executes processes. The control circuit (17) executes a process on acquired video, a process on acquired audio, a process on an acquired annotation, as well as a process for switching information transmission states. The control circuit (17) is configured to be activated by switching modes among a first mode, a second mode, and a third mode that differ in information transmission state.
PCT/JP2023/037094 2022-10-26 2023-10-12 Dispositif de traitement d'informations, système de traitement d'informations et programme de traitement d'informations WO2024090228A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022171617 2022-10-26
JP2022-171617 2022-10-26

Publications (1)

Publication Number Publication Date
WO2024090228A1 true WO2024090228A1 (fr) 2024-05-02

Family

ID=90830655

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/037094 WO2024090228A1 (fr) 2022-10-26 2023-10-12 Dispositif de traitement d'informations, système de traitement d'informations et programme de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2024090228A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007097653A (ja) * 2005-09-30 2007-04-19 Fujinon Corp 内視鏡診断システム
JP2012027565A (ja) * 2010-07-20 2012-02-09 Tryfor Co Ltd 救急患者治療支援システム
US9503681B1 (en) * 2015-05-29 2016-11-22 Purdue Research Foundation Simulated transparent display with augmented reality for remote collaboration
US20170105701A1 (en) * 2015-10-19 2017-04-20 Clarius Mobile Health Corp. Systems and methods for remote graphical feedback of ultrasound scanning technique
JP2019162339A (ja) * 2018-03-20 2019-09-26 ソニー株式会社 手術支援システムおよび表示方法
JP2020502552A (ja) * 2016-11-10 2020-01-23 シンク サージカル, インコーポレイテッド 遠隔指導ステーション
JP2022117596A (ja) * 2021-02-01 2022-08-12 キヤノンメディカルシステムズ株式会社 利用者マッチングシステム
WO2022176531A1 (fr) * 2021-02-19 2022-08-25 ソニーグループ株式会社 Système de gestion médicale, dispositif de gestion médicale et procédé de gestion médicale
