WO2015114901A1 - Medical video recording/playback system and medical video recording/playback apparatus - Google Patents
Medical video recording/playback system and medical video recording/playback apparatus
- Publication number
- WO2015114901A1 (PCT/JP2014/078975)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- moving image
- unit
- past
- feature point
- recording
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/012—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
- A61B1/018—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor for receiving instruments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N23/811—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor
Definitions
- the present invention relates to a medical moving image recording / reproducing system and a medical moving image recording / reproducing apparatus for recording a medical moving image captured by a medical device and reproducing the recorded moving image or a moving image being captured.
- an image of the affected area is searched for among the endoscopic images taken in a past examination, and the still image is displayed in PinP (picture-in-picture).
- the endoscope video recorded by the VTR is displayed in PinP.
- the surgeon refers to the still image of the affected area displayed in PinP and to past endoscopic video, finds the corresponding part in the current examination, and makes a diagnosis or the like by seeing how the affected area has changed since the previous examination.
- For medical image recording apparatuses, a technique has been disclosed that detects feature points from the acquired medical moving image and, during playback, reproduces the moving image from the point at which a feature point was detected, so that a useful moving image can be displayed and played back with a simple operation in a short time (for example, Patent Document 1).
- In Patent Document 1, however, reproduction starts from the location where a feature point was detected in the past moving image. Since the moving image is not reproduced from the start of the examination, it is difficult for the operator to grasp the insertion path, and it may take time to find the affected area in the endoscopic examination.
- An object of the present invention is to provide a technique with high operability that makes it easy to grasp the insertion path to an affected area in an endoscopic examination.
- One aspect of the present invention is a medical video recording/playback system that records and plays back medical moving image data, comprising: an imaging unit that images a subject and acquires an imaging signal; a moving image data generation unit that generates moving image data of the subject based on the imaging signal; a feature point information generation unit that generates, for a designated frame among the frames constituting the moving image data, feature point information indicating that the frame is characteristic as an image of the subject; a recording unit that records the feature point information and the corresponding frame of the moving image data in association with each other; a playback control unit that controls playback of at least one past moving image based on one or more pieces of moving image data read from the recording unit; a synthesis unit that generates composite image data in which the live moving image data generated by the moving image data generation unit and the past moving image data provided from the playback control unit are combined so that the live moving image based on the live moving image data and the past moving image based on the past moving image data are displayed simultaneously, and outputs the resulting composite image data; and a display unit that displays a composite image based on the composite image data output from the synthesis unit, wherein the playback control unit performs control such that playback is paused at a frame of the past moving image data to which the feature point information is attached.
- Another aspect of the present invention is a medical video recording/playback apparatus that records and plays back medical moving image data, comprising: an imaging unit that images a subject and acquires an imaging signal; a moving image data generation unit that generates moving image data of the subject based on the imaging signal; a feature point information generation unit that generates, for a designated frame among the frames constituting the moving image data, feature point information indicating that the frame is characteristic as an image of the subject; a recording unit that records the feature point information and the corresponding frame of the moving image data in association with each other; a playback control unit that controls playback of at least one past moving image based on one or more pieces of moving image data read from the recording unit; and a synthesis unit that generates composite image data in which the live moving image data generated by the moving image data generation unit and the past moving image data provided from the playback control unit are combined so that the live moving image based on the live moving image data and the past moving image based on the past moving image data are displayed simultaneously, and outputs the obtained composite image data to a display unit.
- According to the present invention, in an endoscopic examination, high operability is obtained and the surgeon can easily grasp the insertion path to the affected area.
- FIG. 1 is an overall configuration diagram of a medical moving image recording/playback system.
- FIG. 2 is an overall block diagram of a medical video recording/playback system.
- FIG. 3 is a diagram illustrating a composite image displayed on the monitor based on composite image data obtained by the synthesis unit.
- FIG. 1 is an overall configuration diagram of a medical moving image recording / playback system according to the present embodiment.
- a medical video recording/playback system 100 shown in FIG. 1 includes an endoscope observation apparatus 1, a monitor 2, a scope 3, an image filing server (hereinafter abbreviated as server) 4, a keyboard 5, a wireless LAN (Local Area Network) router 6, and a tablet PC (Personal Computer) 7.
- the medical moving image recording / reproducing system 100 in FIG. 1 obtains moving image data by performing necessary processing in the endoscope observation apparatus 1 on the imaging signal acquired by the scope 3 in the endoscopic examination.
- the obtained moving image data is recorded in the server 4.
- the moving image data recorded in the server 4 can be read out, processed by the video processor of the endoscope observation apparatus 1, and reproduced on the monitor 2.
- Input and setting of various information necessary for endoscopic examination, recording and reproduction of moving image data can be performed via various input means such as the operation switch of the scope 3, the keyboard 5, and the tablet PC 7.
- the endoscope observation apparatus 1 is a device in which units such as a video processor, a light source device, and the monitor 2 are assembled on a trolley; it performs necessary processing on the medical video acquired by the scope 3 and displays the output on the monitor 2.
- the distal end of the scope 3 is inserted into the body cavity of the subject, and the imaging signal acquired in the body cavity is output to the endoscope observation apparatus 1.
- the monitor 2 receives moving image data obtained by processing the imaging signal by the video processor of the endoscope observation apparatus 1 from the endoscope observation apparatus 1 and displays the moving image on the screen.
- the server 4 receives the moving image data obtained by image processing in the endoscope observation apparatus 1 via the wireless LAN router 6, and records the received moving image data.
- the server 4 also records patient information for identifying the subject, that is, the patient, and feature point information indicating frame images specified by the operator or the like in the moving image data, in association with the moving image data. Details of the feature point information will be described later in detail with reference to FIG.
- Here, the endoscope observation apparatus 1, the server 4, and the tablet PC 7 are connected to each other via a wireless LAN, but they may instead be connected by a wired LAN. Moreover, the network between the devices is not limited to a LAN; various well-known networks may be used.
- the keyboard 5 is one of input means for inputting various settings and instructions to the connected endoscope observation apparatus 1.
- Settings and instructions input from the keyboard 5 include settings and instructions related to the endoscopic examination, instructions related to playback of moving image data on the monitor 2 such as play and stop, and instructions such as the addition of feature point information.
- the tablet PC 7 is one of input means for inputting an instruction regarding reproduction of the moving image data on the monitor 2 and an instruction such as addition of feature point information.
- Before an examination, the endoscope observation apparatus 1 uses the patient information identifying the patient about to be examined to confirm with the server 4 whether moving image data with matching patient information has already been recorded.
- If such moving image data exists, it is transmitted to the endoscope observation apparatus 1 via the wireless LAN router 6.
- the endoscope observation apparatus 1 displays the live moving image received from the scope 3 on the monitor 2, and also displays on the monitor 2 the past moving image based on the moving image data received from the server 4 and stored in the memory or the like of its own device. On the monitor 2, the past moving image is played back while the live moving image being captured by the scope 3 is displayed. The surgeon inserts the scope 3 into the patient's body cavity, identifies the places that need to be examined, that is, the places where an affected area or the like appears in the past moving image, and observes them.
- Operations such as play, stop, fast-forward, and rewind of the moving image can be performed via the operation switches at the surgeon's hand on the scope 3, a foot switch not shown in FIG. 1, the keyboard 5 operated by staff in the unclean area, the tablet PC 7, and the like.
- the operation for assigning the feature point information to the moving image data can also be performed using these means.
- In the medical moving image recording/playback system 100, during an endoscopic examination, a frame image in which an affected area or the like appears is recorded in association with the above-described feature point information.
- the past video read from the server 4 is played back and displayed on the monitor 2, and playback is paused at the frame image to which the feature point information has been added.
- FIG. 2 is an overall block diagram of the medical video recording / playback system 100 according to the present embodiment.
- In FIG. 2, the keyboard 5 and the wireless LAN router 6 of FIG. 1 are omitted from the configuration of the medical moving image recording/playback system 100, and a foot switch 8 and a USB (Universal Serial Bus) memory 9 are added.
- In FIG. 2, only the configuration related to the method of recording and playing back medical moving images according to the present embodiment is shown, and the description of other configurations is omitted.
- the scope 3 includes an imaging unit 31, a scope switch 32, and a treatment instrument insertion detection unit 33.
- the imaging unit 31 includes a lens and a CCD (Charge Coupled Device), and acquires an imaging signal of the subject.
- the scope switch 32 is disposed in the operation unit at the surgeon's hand, and is used to instruct the video processor 50 of the endoscope observation apparatus 1 to perform various endoscopy operations and to add feature point information to the moving image being captured.
- the treatment instrument insertion detection unit 33 includes, for example, a sensor provided in the forceps hole at the distal end of the scope 3, and the sensor detects that the surgeon, as the user, has passed forceps through the forceps hole. How the detection result of the treatment instrument insertion detection unit 33 is used will be described in detail in the description of the first modification.
- the foot switch 8 is used to input various operations related to operation of the scope 3 in endoscopy and surgery, reproduction of moving images, and addition of feature point information.
- the tablet PC 7 has a voice input reception unit 71 and a voice input transmission unit 72.
- the voice input transmitting unit 72 transmits the voice data input to the tablet PC 7 toward the video processor 50.
- the video processor 50 receives the audio data transmitted from the audio input transmission unit 72 of the tablet PC 7 through the tablet communication unit 22.
- the video processor 50 that has received the audio data analyzes it in an analysis unit (not shown in FIG. 2) to acquire the necessary information, and passes the acquired information to each unit constituting the video processor 50. Communication between the tablet PC 7 and the video processor 50 is not limited to the audio data communication illustrated in FIG. 2.
- the video processor 50 of the endoscope observation apparatus 1 includes an image processing unit 11, a synthesis unit 12, a recording moving image data generation unit 13, a moving image recording area 14, a decoder 15, a playback control unit 16, a trigger identification unit 17, a trigger type information generation unit 18, a frame number data generation unit 19, a time stamp generation unit 20, an insertion length data generation unit 21, a tablet communication unit 22, an insertion length detection unit 23, a feature point information generation unit 24, a feature point information recording area 25, and a processor internal memory 26.
- the video processor 50 performs necessary processing on the imaging signal obtained by imaging with the imaging unit 31 of the scope 3 to obtain moving image data, and outputs the moving image data to the monitor 2.
- the image processing unit 11 performs necessary image processing on the imaging signal input from the scope 3 to obtain moving image data.
- the synthesis unit 12 generates composite image data in which the moving image data input from the image processing unit 11, that is, the live moving image data, and the past moving image data read from the server 4 or the USB memory 9 are combined so that the live moving image based on the live moving image data and the past moving image based on the past moving image data are displayed on the monitor 2 simultaneously, and transmits the composite image data to the monitor 2.
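As a rough illustration of this side-by-side composition (a sketch, not the actual processing in the synthesis unit 12; frame sizes and the black background fill are assumptions), the live frame and the past frame can be placed on one canvas like this:

```python
import numpy as np

def compose_side_by_side(live: np.ndarray, past: np.ndarray) -> np.ndarray:
    """Combine a live frame and a past frame into one composite frame.

    Both frames are H x W x 3 uint8 arrays; unused canvas area stays black.
    """
    h = max(live.shape[0], past.shape[0])
    w = live.shape[1] + past.shape[1]
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    canvas[:live.shape[0], :live.shape[1]] = live    # live video on the left
    canvas[:past.shape[0], live.shape[1]:] = past    # past video on the right
    return canvas
```

In practice the two frames would be scaled to a common height before composition; the sketch simply pads the shorter one with black.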
- the recording moving image data generation unit 13 is composed of, for example, an encoder, and performs necessary processing such as encoding on the moving image data input from the image processing unit 11 to generate moving image data for recording.
- At this time, various information such as the frame number, the time stamp, and the insertion length of the scope 3 may be recorded in association with the moving image data for recording.
- the moving image recording area 14 is an area for temporarily storing moving image data for recording generated by the recording moving image data generation unit 13 or moving image data read from the server 4 or the USB memory 9.
- the decoder 15 decodes the moving image data held in the moving image recording area 14.
- the decoded moving image data is given to the reproduction control unit 16.
- the reproduction control unit 16 controls the reproduction of past moving images based on the moving image data decoded by the decoder 15 and the feature point information associated therewith.
- the reproduction control unit 16 outputs the past moving image data for which reproduction is controlled, together with the feature point information, to the synthesis unit 12.
- the trigger identification unit 17 recognizes a user operation that serves as a trigger for associating feature point information with the corresponding frame image.
- the user operation serving as a trigger is an operation of the scope switch 32, the foot switch 8, the tablet PC 7, or the keyboard 5 of FIG. 1 (not shown in FIG. 2). Communication with the tablet PC 7 is performed via the tablet communication unit 22 as described above.
- the trigger type information generation unit 18 generates information indicating by which means the user operation recognized by the trigger identification unit 17 is input.
- the trigger type information includes, for example, the scope 3 (the scope switch 32 and the treatment instrument insertion detection unit 33), the foot switch 8, the tablet PC 7, the keyboard 5, and the like.
- the frame number data generation unit 19 generates data representing the frame number of the corresponding frame image when an operation for assigning feature point information is performed from the scope 3 or the like.
- the time stamp generation unit 20 detects the date and time when the operation for adding feature point information is performed, and generates a time stamp.
- the insertion length data generation unit 21 generates insertion length data of the scope 3 based on the insertion length of the scope 3 at the timing when the feature point information addition operation detected by the insertion length detection unit 23 is performed.
- A known technique is used for generating the insertion length data from the detection result of the insertion length detection unit 23 in the insertion length data generation unit 21. A specific method of using the insertion length data will be described in detail in the description of the second modification.
- the feature point information generation unit 24 generates feature point information from data input from the trigger type information generation unit 18, the frame number data generation unit 19, the time stamp generation unit 20, and the insertion length data generation unit 21.
- the feature point information includes the frame number of the frame image, the time stamp, the insertion length of the scope 3, and the like, corresponding to the timing at which the surgeon or another user of the medical moving image recording/playback system 100 operated the scope switch 32 or the like.
- the feature point information may include other information, or may include only part of the above information.
- the frame image to which the feature point information is added is thereby marked as a frame image, specified by the user, in which an affected area or the like appears in the moving image.
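The feature point record described above can be sketched as a simple data structure. The field names, types, and string values below are illustrative assumptions, not the actual format used by the feature point information generation unit 24:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeaturePoint:
    frame_number: int        # frame flagged by the user operation
    timestamp: str           # date and time the flag operation occurred
    trigger_type: str        # input means, e.g. "scope_switch", "foot_switch", "tablet", "keyboard"
    insertion_length_mm: Optional[float] = None  # scope insertion length at that timing, if detected
    observation_mode: Optional[str] = None       # e.g. "normal", "narrow_band", "fluorescence", "infrared"
```

A record like this is stored per flagged frame, in association with the moving image data, so that playback control and progress-bar display can later look up the flagged frames.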
- the feature point information recording area 25 is an area for storing feature point information generated in the video processor 50.
- shooting conditions such as an observation mode (normal light observation mode, narrow-band light observation mode, fluorescence observation mode, infrared light observation mode, etc.) are also stored in the feature point information recording area 25.
- the video processor 50 records the feature point information and the like held in the feature point information recording area 25 to the server 4 or the USB memory 9 in association with the moving image data held in the moving image recording area 14.
- the playback control unit 16 refers to the frame numbers in the feature point information, and pauses playback at the frames of the past moving image data whose frame numbers correspond to the feature point information.
- the synthesizing unit 12 generates synthesized image data to be displayed on the monitor 2 based on the live moving image data input from the image processing unit 11 and the past moving image data input from the reproduction control unit 16.
- the generated composite image data is output.
- During the pause, composite image data in which the past moving image stands still at the frame to which the feature point information is attached is generated and output to the monitor 2.
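The pause behaviour of the playback control unit 16 can be sketched as a small frame-driver loop. The generator below yields frame indices in order and blocks on a caller-supplied resume callback at each flagged frame; the names and the callback interface are assumptions for illustration:

```python
from typing import Callable, Iterator, Set

def playback(total_frames: int, feature_frames: Set[int],
             wait_for_resume: Callable[[int], None]) -> Iterator[int]:
    """Yield frame indices 0..total_frames-1 in order.

    After yielding a frame that carries feature point information,
    call wait_for_resume(frame) so playback stays paused until the
    operator requests it to continue.
    """
    for frame in range(total_frames):
        yield frame
        if frame in feature_frames:
            wait_for_resume(frame)  # blocks until the user resumes playback
```

In a real device the resume callback would wait on the scope switch, foot switch, or tablet input rather than return immediately.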
- the monitor 2 displays a composite image based on the composite image data received from the composition unit 12.
- FIG. 3 is a diagram illustrating a composite image displayed on the monitor 2 based on the composite image data obtained by the composition unit 12.
- the live video P1 and the past video P2 are displayed side by side on the monitor 2. Further, as shown in FIG. 3, a progress bar PB may be displayed beside the past video P2.
- the progress bar PB in FIG. 3 shows the relative position within the moving image of each timing at which feature point information was added during the period from the start to the end of the past endoscopic examination video, together with the elapsed time from the start of the video.
- In FIG. 3, the shooting time T of the past moving image P2 is 20 minutes, and feature point information f1, f2, and f3 has been added to the moving image P2.
- the feature point information f1, f2, and f3 is displayed so that the user, such as the surgeon, can easily grasp visually that it was added at the timings of 5 minutes, 7 minutes, and 12 minutes from the start of the moving image.
- By referring to the progress bar PB, the surgeon can recognize the time required, after inserting the scope 3 and starting the examination (starting capture of the moving image), to reach a location to which feature point information was added, that is, an affected area or the like.
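Placing the markers f1 to f3 on the progress bar PB reduces to mapping each feature-point time to a proportional offset along the bar. A minimal sketch (the pixel width of the bar is an arbitrary assumption):

```python
def marker_positions(feature_times_s, total_s, bar_width_px):
    """Map feature-point timestamps (seconds from the start of the video)
    to pixel offsets on a progress bar that is bar_width_px pixels wide."""
    return [round(t / total_s * bar_width_px) for t in feature_times_s]

# The example in the text: a 20-minute video with feature points
# at 5, 7, and 12 minutes, drawn on a 200-pixel bar.
print(marker_positions([5 * 60, 7 * 60, 12 * 60], 20 * 60, 200))  # → [50, 70, 120]
```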
- As described above, playback of the past moving image P2 is paused at the frames to which the feature point information f1 to f3 has been added.
- the surgeon determines the position of the affected area with reference to the progress bar PB, and performs a detailed examination by comparing the past frame images of the affected area to which the feature point information f1 to f3 was added with the live moving image.
- the video processor 50 in FIG. 2 continues to record, in the server 4 or the USB memory 9, the moving image data combined with the live moving image in the synthesis unit 12, in association with the feature point information.
- the server 4 includes a moving image recording area 41, a feature point information recording area 42, and a feature point information generating unit 43.
- the moving image recording area 41 is an area for recording moving image data received from the video processor 50 of the endoscope observation apparatus 1.
- the feature point information recording area 42 is an area for recording the feature point information received together with the moving image data from the video processor 50.
- the operation of the feature point information generation unit 43 is the same as that of the feature point information generation unit 24 of the video processor 50.
- the feature point information generation unit 43 generates feature point information from information input directly to the server 4 or indirectly via a network.
- the USB memory 9 has a moving image recording area 91 and a feature point information recording area 92.
- the moving image recording area 91 and the feature point information recording area 92 are areas for holding moving image data and feature point information, respectively, similarly to the moving image recording area 41 and the feature point information recording area 42 of the server 4.
- the processor internal memory 26 is storage means provided in the video processor 50. As shown in FIG. 2, the processor internal memory 26 has a moving image recording area 27 and a feature point information recording area 28. Like those of the server 4 and the USB memory 9, the moving image recording area 27 and the feature point information recording area 28 are areas provided to hold moving image data and feature point information, respectively.
- As described above, the video processor 50 of the endoscope observation apparatus 1 displays on the monitor 2, together with the live moving image, that is, the image being captured in the endoscopic examination, a past examination video of the same patient (patient information).
- the past moving image is played back from the start of the examination, and is paused at the locations to which feature point information was added.
- Therefore, the surgeon can easily grasp the insertion path of the scope 3 to the affected area, and can perform the examination without having to ask staff in the unclean area to pause the video at the affected area or the like to which feature point information was added in a past examination.
- the medical video recording and playback method according to the present embodiment is not limited to this. For example, the live moving image and the past moving image may be displayed on separate monitors. In that case, the user can arbitrarily determine the arrangement of each monitor according to the usage environment of the monitor 2 or the like.
- In the above description, feature point information is added via the operation switch or the like of the scope 3, via the keyboard 5 connected to the endoscope observation apparatus 1, or via the tablet PC 7 capable of wireless communication through the wireless LAN router 6.
- In the first modification, the treatment instrument insertion detection unit 33 in FIG. 2 recognizes that a treatment instrument such as forceps has been inserted, and this is treated as a trigger for adding feature point information.
- When the video processor 50 of the endoscope observation apparatus 1 detects that a treatment tool such as forceps has been inserted through the forceps hole of the scope 3, it determines that there is an affected area at the position of the scope 3 at the time the insertion was detected. Feature point information is then added to the frame image at the time the insertion of the forceps was detected.
- the feature point information includes the insertion length of the scope 3.
- In the second modification, the video processor 50 obtains, from the scope 3 insertion length data of the plurality of pieces of feature point information, the frame number corresponding to the insertion length designated by the user, and starts playback of the past moving image from that frame.
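One way to realize this lookup is to choose, among the recorded feature points, the one whose insertion length is closest to the user-designated length. The tuple representation and the nearest-match rule below are illustrative assumptions:

```python
def frame_for_insertion_length(feature_points, target_mm):
    """feature_points: iterable of (frame_number, insertion_length_mm) pairs.

    Return the frame number whose recorded insertion length is closest
    to the designated length, or None if no feature point carries one.
    """
    pts = [(frame, length) for frame, length in feature_points if length is not None]
    if not pts:
        return None
    frame, _ = min(pts, key=lambda p: abs(p[1] - target_mm))
    return frame
```

Playback of the past moving image would then start from the returned frame number.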
- In FIG. 3, a configuration is adopted in which a live moving image and a past moving image are displayed side by side on the monitor 2.
- In the third modification, a display method on the monitor 2 is adopted that takes into account convenience when a user such as the surgeon performs an examination while comparing the live moving image with a past moving image.
- a method for displaying a moving image on the monitor 2 by the medical moving image recording / reproducing system 100 according to the present embodiment will be described in detail with reference to FIGS.
- FIG. 4 is a diagram illustrating detailed blocks of the synthesis unit 12 in the video processor 50 of the endoscope observation apparatus 1. With reference to FIG. 4, first, a specific description will be given of how to generate a composite image to be displayed on the monitor 2 by combining a live video and a past video.
- the synthesis unit 12 includes a live observation character data generation unit 51, a live observation character data superimposition unit 52, a live observation moving image display frame superimposition unit 53, a character data superimposition position setting unit 54, a recorded moving image character data generation unit 55, a shooting condition identification unit 56, a recorded moving image character data superimposition unit 57, a display frame shape setting unit 58, a recorded moving image display frame superimposition unit 59, an addition unit 60, a display content storage area 61, a display content selection unit 62, and a display content selection type storage area 63.
- the character data superimposition position setting unit 54 sets, based on the live moving image data input from the image processing unit 11 and the past moving image data input from the playback control unit 16 in FIG. 2, the positions of the characters or the like that indicate, when the live moving image and the past moving image are displayed on the monitor 2, whether each moving image is a live moving image or a past moving image. The positions of these characters and the like are determined based on the contents set by the user via the tablet PC 7 and the keyboard 5 shown in FIG. 1.
- The live observation character data generation unit 51 generates character data indicating that, of the two moving images displayed on the monitor 2, one is the live moving image.
- The live observation character data superimposition unit 52 superimposes the character data generated by the live observation character data generation unit 51 on the live moving image data input from the image processing unit 11.
- The live observation moving image display frame superimposition unit 53 further superimposes a display frame for the live moving image on the data obtained by superimposing the character data on the live moving image in the live observation character data superimposition unit 52.
- The shooting condition identification unit 56 acquires the shooting conditions of the past moving image from the information input from the reproduction control unit 16.
- The shooting conditions include, for example, the observation mode, that is, whether the moving image was captured by normal light observation or by special light observation (narrow-band light observation, fluorescence observation, infrared light observation, etc.).
- The recorded moving image character data generation unit 55 generates character data indicating that, of the two moving images displayed on the monitor 2, one is the past moving image, together with character data for displaying the shooting conditions.
- The recorded moving image character data superimposition unit 57 superimposes the character data generated by the recorded moving image character data generation unit 55 on the past moving image data input from the reproduction control unit 16.
- The recorded moving image display frame superimposition unit 59 further superimposes a display frame for the past moving image on the data obtained by superimposing the character data on the past moving image in the recorded moving image character data superimposition unit 57.
- For the display frame of the past moving image, a shape or the like different from that of the display frame for the live moving image is set so that the user can easily distinguish the two moving images on the monitor 2.
- The display frame shape setting unit 58 sets the display frame of the past moving image according to the display mode set by the user via input means such as the tablet PC 7 or the keyboard 5.
- The recorded moving image display frame superimposition unit 59 superimposes the past moving image display frame set by the display frame shape setting unit 58 on the past moving image and character data.
- The adding unit 60 arranges the live moving image data and the past moving image data, each with the character data and display frame superimposed, at predetermined positions on the screen to generate composite image data, and outputs the composite image data to the monitor 2.
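The flow through the synthesis unit 12 described above — character data and a display frame superimposed on each stream, then both panes arranged by the adding unit 60 — can be sketched as follows. This is a minimal illustration only; the class and function names and the pane representation are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Pane:
    """One moving-image stream with its overlay settings (cf. units 51-59)."""
    label: str              # e.g. "LIVE" or "PAST (AFI mode)"
    frame_shape: str        # display frame shape, e.g. "single-rect"
    label_pos: str = "top"  # cf. the character data superimposition position setting unit 54

def superimpose(pane: Pane) -> dict:
    """Superimpose the character data and display frame on one stream."""
    return {"label": pane.label, "frame": pane.frame_shape, "pos": pane.label_pos}

def add_unit(live: Pane, past: Pane) -> dict:
    """Cf. adding unit 60: arrange both panes at predetermined positions."""
    return {"left": superimpose(live), "right": superimpose(past)}

composite = add_unit(Pane("LIVE", "single-rect"),
                     Pane("PAST (AFI mode)", "double-rect"))
```

The point of the sketch is the ordering: each stream gets its own overlays first, and only then are the two panes composed into one image for the monitor.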
- In the medical moving image recording/reproducing system 100 according to the present embodiment, the user can select the desired characters and display frames to be superimposed on the live moving image data and the past moving image data, and how they are to be displayed.
- The display content selection unit 62 receives the contents selected by the user for these characters and display frames via various input means such as the tablet PC 7.
- The display content selection type storage area 63 is an area for storing the various types of displayable characters and display frames that the medical moving image recording/reproducing system 100 can provide.
- The display content storage area 61 is an area for storing the characters and display frames selected by the user from among the various types of characters and display frames stored in the display content selection type storage area 63.
- Hereinafter, the type of display form, such as the characters and display frames, that the medical moving image recording/reproducing system 100 can provide to the user is referred to as a "display mode".
- FIG. 5 is a diagram illustrating a display mode selection screen.
- FIG. 5A illustrates a display mode setting change screen W1
- FIG. 5B illustrates a screen W2 that displays a list of display modes that can be set by the user.
- When the user selects the "image display selection mode" on the display mode setting change screen W1 of FIG. 5(a), the video processor 50 of the endoscope observation apparatus 1 displays the display mode list screen W2 of FIG. 5(b) in accordance with the information stored in the display content storage area 61.
- the video processor 50 stores the display content selection type information on characters and display frames to be displayed on the monitor 2 in the normal mode.
- the display is switched to the screen illustrated in FIG. 3, for example.
- the display mode list screen W2 in FIG. 5B illustrates eight display modes.
- The user selects one display mode from the screen W2 in FIG. 5(b).
- The synthesis unit 12 holds the character and display frame information corresponding to the selected display mode in the display content storage area 61, and uses it for the subsequent display of moving images on the monitor 2.
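The relationship between the display content selection type storage area 63 (all display modes the system can offer) and the display content storage area 61 (the choice currently held for display) can be sketched roughly as below; the class name and the mode labels are illustrative assumptions.

```python
class DisplayContentStore:
    """Sketch of areas 63 and 61: offered display modes vs. the user's pick."""

    def __init__(self, available_modes):
        # cf. area 63: every display mode the system can provide
        self.selection_types = list(available_modes)
        # cf. area 61: the mode currently held for display on the monitor
        self.selected = self.selection_types[0]

    def select(self, mode):
        """Hold the user's choice; used for subsequent monitor display."""
        if mode not in self.selection_types:
            raise ValueError("unknown display mode: " + mode)
        self.selected = mode

store = DisplayContentStore(["chars-top-inside", "chars-bottom-inside",
                             "chars-outside", "circular-past-frame"])
store.select("circular-past-frame")
```

The separation mirrors the text: the list of offerable types is fixed by the system, while the held selection changes only when the user picks from that list.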
- The screen W2 may display a sample screen for each display mode so that the user can set the display mode by selecting a sample. Specific examples of the display modes are shown in FIG. 6.
- The display modes of FIGS. 6(a) and 6(b) display the characters "LIVE" and "PAST (AFI mode)", indicating that the moving images are a live moving image and a past moving image, inside the display frames: FIG. 6(a) displays the characters in the upper part of the display frame, while FIG. 6(b) displays them in the lower part.
- Together with the characters indicating the past moving image, characters indicating that the observation mode of the past moving image is the AFI (fluorescence observation) mode are also displayed.
- The display modes of FIGS. 6(c) and 6(d) display the characters indicating that the moving images are a live moving image and a past moving image outside the display frame of each moving image: FIG. 6(c) displays the characters near the upper edge outside the display frame, while FIG. 6(d) displays them near the lower edge outside the display frame.
- The display frame of the past moving image is given a color and shape different from those of the display frame of the live moving image so that the user does not mistake the live moving image and the past moving image on the monitor 2. FIG. 6(e) encloses the past moving image in a double rectangular display frame, and FIG. 6(f) uses a circular display frame for the past moving image; both differ in shape from the single rectangular display frame of the live moving image.
- FIG. 6(g) shows a case where three past moving images are displayed together with a live moving image. The past moving images are displayed relatively smaller than the live moving image, and, as in FIG. 6(f), the display frames of the past moving images are circular so that the user can distinguish the live moving image from the past moving images. The shooting conditions of the past moving images are also displayed.
- FIG. 6 illustrates the case where the observation mode is displayed as characters as a shooting condition of the past moving image, but the shooting conditions displayed as characters are not limited thereto.
- In addition to, or instead of, the observation mode, information such as information identifying the scope or the enlargement magnification may be displayed. Whether to display these shooting conditions, and which information to display when they are displayed, can also be configured to be selected by the user, with the display performed based on the selection.
- The system may also be configured so that the user can change the position of the characters displayed on the screen from the default position by operating the touch panel monitor, the tablet PC 7, or the like. According to this, for example, the display can be changed from the display mode in which the characters appear in the upper part of the display frame as in FIG. 6(a) to the display mode in which the characters appear in the lower part of the display frame as in FIG. 6(b), or to the display mode in which the characters appear near the frame outside the display frame as in FIG. 6(c).
- The information recorded in association with the past moving image data is read from the server 4, the USB memory 9, or the video processor 50 and displayed.
- The synthesis unit 12 of the video processor 50 may also be configured to change the display form according to the state of the moving image. This will be described with reference to FIG. 7, which illustrates an example of changing the display form on the monitor 2.
- The image itself may be dark depending on the observation mode and the observation position in the body cavity.
- When a display mode in which the above-described characters are displayed inside the display frame is set and the image becomes dark, it may be difficult for the user to read the characters in the display frame. Therefore, a configuration may be adopted in which the color value of the image in the shaded area in FIG. 7 is compared with a predetermined threshold value, and when the color value of the image in that area is determined to be close to the color value of the characters, the character color is switched to, for example, white. In this way, the character color is automatically switched in real time so that the contrast between the character color and the background image becomes clear, effectively preventing the characters from becoming difficult to read against the image.
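The threshold comparison described above can be illustrated with a short sketch. The value scale, the threshold, and the function name are assumptions; the patent only specifies comparing the image's color value in the character region against a threshold and switching the character color when contrast would be poor.

```python
def pick_char_color(bg_value: float, char_value: float,
                    threshold: float = 0.2) -> str:
    """Switch the character color when the background's color value is
    too close to the character's (values normalised to 0.0 = black,
    1.0 = white); otherwise keep the currently set color."""
    if abs(bg_value - char_value) < threshold:
        # flip to the shade opposite the background for clear contrast
        return "white" if bg_value < 0.5 else "black"
    return "current"

# a dark endoscope image behind dark characters -> switch to white
color = pick_char_color(bg_value=0.10, char_value=0.15)
```

Run per frame, this gives the real-time switching behaviour the text describes; the fixed-color option in the next paragraph corresponds to simply bypassing the function.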
- Whether or not to change the character color can be set arbitrarily by the user. That is, when the user fixes the character color, it is possible to effectively prevent the character color displayed on the monitor 2 from being switched frequently according to the color value of the image.
- As described above, according to the present embodiment, the user can set a desired display mode so that a user such as an operator can easily identify, during operation of the endoscope, which of the plurality of moving images displayed on the monitor 2 is the live moving image. Since the user can also arrange each moving image at a desired position, it is possible to effectively prevent the user from mistaking the live moving image for a past moving image.
- The present invention is not limited to the above-described embodiments as they are, and can be embodied by modifying the constituent elements without departing from the scope of the invention at the implementation stage.
- Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the embodiments. For example, constituent elements shown in the embodiments may be appropriately combined, and constituent elements across different embodiments may also be appropriately combined. It goes without saying that various modifications and applications are possible without departing from the spirit of the invention.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Radiology & Medical Imaging (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- Optics & Photonics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
Abstract
Description
<First Embodiment>
FIG. 1 is an overall configuration diagram of the medical moving image recording/reproducing system according to the present embodiment. The medical moving image recording/reproducing system 100 shown in FIG. 1 includes an endoscope observation apparatus 1, a monitor 2, a scope 3, an image filing server (hereinafter abbreviated as server) 4, a keyboard 5, a wireless LAN (Local Area Network) router 6, and a tablet PC (Personal Computer) 7.
In the medical moving image recording/reproducing system 100 according to the present embodiment, in an endoscopic examination, the endoscope observation apparatus 1 first checks, based on patient information identifying the patient about to be examined, whether moving image data with matching patient information is already recorded in the server 4. When moving image data with matching patient information is recorded in the server 4, the moving image data is transmitted to the endoscope observation apparatus 1 via the wireless LAN router 6. The endoscope observation apparatus 1 displays the moving image received from the scope 3 live on the monitor 2 and, at the same time, displays a past moving image on the monitor 2 based on the moving image data received from the server 4 and stored in its own memory or the like. On the monitor 2, the past moving image is reproduced while the live moving image being captured by the scope 3 is displayed. The operator inserts the scope 3 into the body cavity of the patient and performs observation, referring to the past moving image to determine the locations requiring close examination, that is, the locations where an affected part or the like appears in the past moving image.
The reproduction control unit 16 controls reproduction of the past moving image based on the moving image data decoded by the decoder 15 and the feature point information associated with it. The reproduction control unit 16 outputs the data of the past moving image whose reproduction it is controlling, together with the feature point information, to the synthesis unit 12.
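The pause control performed by the reproduction control unit 16 — stopping playback at each frame to which feature point information is attached, as claimed — can be sketched as follows. The function names and the frame representation are assumptions for illustration only.

```python
def play(frames, feature_frames, on_show):
    """Step through past moving-image frames, pausing at every frame
    that carries feature point information; returns the pause indices."""
    paused_at = []
    for idx, frame in enumerate(frames):
        on_show(frame)                  # hand the frame to the monitor
        if idx in feature_frames:       # feature point info attached here
            paused_at.append(idx)       # playback waits for user resume
    return paused_at

shown = []
pauses = play(frames=["f0", "f1", "f2", "f3"],
              feature_frames={1, 3},
              on_show=shown.append)
```

In the actual system the pause would block until the user resumes; here the pause points are merely recorded to keep the sketch self-contained.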
The trigger identification unit 17 recognizes a user operation serving as the trigger for associating feature point information with the corresponding frame image. The triggering user operation refers to an operation of the scope switch 32, the foot switch 8, the tablet PC 7, the keyboard 5 of FIG. 1 (not shown in FIG. 2), or the like. Communication with the tablet PC 7 is performed via the tablet communication unit 22, as described above.
In the example shown in FIG. 3, a live moving image P1 and a past moving image P2 are displayed side by side on the monitor 2. As shown in FIG. 3, a progress bar PB may be displayed beside the past moving image P2.
<First Modification>
<Second Modification>
<Second Embodiment>
FIG. 5 is a diagram illustrating a display mode selection screen. FIG. 5(a) illustrates a display mode setting change screen W1, and FIG. 5(b) illustrates a screen W2 displaying a list of the display modes that can be set by the user.
FIG. 6 illustrates the case where the observation mode is displayed as characters as a shooting condition of the past moving image, but the shooting conditions displayed as characters are not limited thereto. In addition to, or instead of, the observation mode, information such as information identifying the scope or the enlargement magnification may be displayed. Whether to display these shooting conditions, and which information to display when they are displayed, may also be configured to be selected by the user, with the display performed based on the selection.
FIG. 7 is a diagram illustrating an example of changing the display form on the monitor 2.
2 Monitor
3 Scope
4 Image filing server (server)
5 Keyboard
6 Wireless LAN router
7 Tablet PC
50 Video processor
100 Medical moving image recording/reproducing system
Claims (10)
- A medical moving image recording and reproducing system for recording and reproducing medical moving image data, the system comprising:
an imaging unit that images a subject to acquire an imaging signal;
a moving image data generation unit that generates moving image data of the subject based on the imaging signal;
a feature point information generation unit that generates, for a designated frame among the frames constituting the moving image data, feature point information indicating that the frame is an image having a feature as an image of the subject;
a recording unit that records the feature point information in association with the corresponding frame of the moving image data;
a reproduction control unit that controls reproduction of at least one past moving image based on one or more pieces of moving image data read from the recording unit;
a synthesis unit that combines the live moving image data generated by the moving image data generation unit and the past moving image data supplied from the reproduction control unit so that a live moving image based on the live moving image data and a past moving image based on the past moving image data are displayed simultaneously, thereby generating composite image data, and outputs the obtained composite image data; and
a display unit that displays a composite image based on the composite image data output from the synthesis unit,
wherein the reproduction control unit performs control to pause reproduction at a frame of the past moving image data to which the feature point information is attached.
- The medical moving image recording and reproducing system according to claim 1, wherein the synthesis unit generates the composite image in which a progress bar representing the reproduction status of the past moving image read from the recording unit and reproduced is further combined, and outputs the composite image to the display unit.
- The medical moving image recording and reproducing system according to claim 1 or 2, wherein the synthesis unit generates the composite image on which characters indicating that the moving images displayed on the display unit are, respectively, the moving image being captured and the past moving image are superimposed, and outputs the composite image to the display unit.
- The medical moving image recording and reproducing system according to claim 3, wherein the synthesis unit generates the composite image in which a designated display frame is further superimposed on each of the moving image being captured and the past moving image displayed on the display unit, and outputs the composite image to the display unit.
- The medical moving image recording and reproducing system according to claim 4, wherein the synthesis unit generates the composite image on which characters representing the observation mode of the displayed past moving image are further superimposed, and outputs the composite image to the display unit.
- The medical moving image recording and reproducing system according to claim 5, wherein the display unit displays the user-settable types of characters and display frames to be superimposed on the moving images, and the synthesis unit generates the composite image using the characters and display frame selected by the user from among the types displayed on the display unit.
- The medical moving image recording and reproducing system according to claim 6, wherein the feature point information generation unit generates the feature point information at a timing designated via an operation switch of the endoscope apparatus, a foot switch, the keyboard of the medical moving image recording and reproducing system, or a terminal device connected thereto.
- The medical moving image recording and reproducing system according to claim 6, wherein the feature point information generation unit generates the feature point information at a timing at which insertion of forceps through a forceps hole of the endoscope apparatus is detected.
- The medical moving image recording and reproducing system according to claim 6, further comprising an insertion length detection unit that detects an insertion length of an insertion portion of the endoscope apparatus into the body cavity of the subject, wherein the reproduction control unit reproduces the past moving image from a position corresponding to the insertion length of the insertion portion designated by the user.
- A medical moving image recording and reproducing apparatus for recording and reproducing medical moving image data, the apparatus comprising:
an imaging unit that images a subject to acquire an imaging signal;
a moving image data generation unit that generates moving image data of the subject based on the imaging signal;
a feature point information generation unit that generates, for a designated frame among the frames constituting the moving image data, feature point information indicating that the frame is an image having a feature as an image of the subject;
a reproduction control unit that controls reproduction of at least one past moving image based on one or more pieces of moving image data read from a recording unit that records the feature point information in association with the corresponding frame of the moving image data; and
a synthesis unit that combines the live moving image data generated by the moving image data generation unit and the past moving image data supplied from the reproduction control unit so that a live moving image based on the live moving image data and a past moving image based on the past moving image data are displayed simultaneously, thereby generating composite image data, and outputs the obtained composite image data to a display unit,
wherein the reproduction control unit performs control to pause reproduction at a frame of the past moving image data to which the feature point information is attached.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14880937.9A EP3100668A4 (en) | 2014-01-30 | 2014-10-30 | Medical video recording and playback system and medical video recording and playback device |
JP2015537058A JP5905168B2 (ja) | 2014-01-30 | 2014-10-30 | 医療用動画記録再生システム及び医療用動画記録再生装置 |
CN201480070403.1A CN105848559B (zh) | 2014-01-30 | 2014-10-30 | 医疗用运动图像记录再现系统以及医疗用运动图像记录再现装置 |
US15/205,933 US20160323514A1 (en) | 2014-01-30 | 2016-07-08 | Medical video recording and reproducing system and medical video recording and reproducing device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014016101 | 2014-01-30 | ||
JP2014-016101 | 2014-01-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/205,933 Continuation US20160323514A1 (en) | 2014-01-30 | 2016-07-08 | Medical video recording and reproducing system and medical video recording and reproducing device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015114901A1 true WO2015114901A1 (ja) | 2015-08-06 |
Family
ID=53756499
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/078975 WO2015114901A1 (ja) | 2014-01-30 | 2014-10-30 | 医療用動画記録再生システム及び医療用動画記録再生装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20160323514A1 (ja) |
EP (1) | EP3100668A4 (ja) |
JP (1) | JP5905168B2 (ja) |
CN (1) | CN105848559B (ja) |
WO (1) | WO2015114901A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017093996A (ja) * | 2015-11-27 | 2017-06-01 | オリンパス株式会社 | 内視鏡システム及びその表示方法 |
WO2017126313A1 (ja) * | 2016-01-19 | 2017-07-27 | 株式会社ファソテック | 生体質感臓器を用いる手術トレーニング及びシミュレーションシステム |
JP2018007960A (ja) * | 2016-07-15 | 2018-01-18 | Hoya株式会社 | 内視鏡装置 |
WO2019049451A1 (ja) * | 2017-09-05 | 2019-03-14 | オリンパス株式会社 | ビデオプロセッサ、内視鏡システム、表示方法、及び表示プログラム |
WO2019198322A1 (ja) * | 2018-04-10 | 2019-10-17 | オリンパス株式会社 | 医療システム |
WO2024202956A1 (ja) * | 2023-03-27 | 2024-10-03 | ソニーグループ株式会社 | 医療用データ処理装置、医療用システム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019039252A1 (ja) * | 2017-08-24 | 2019-02-28 | 富士フイルム株式会社 | 医療画像処理装置及び医療画像処理方法 |
CN110047587A (zh) * | 2018-09-29 | 2019-07-23 | 苏州爱医斯坦智能科技有限公司 | 一种医疗数据采集方法、装置、设备及存储介质 |
CN114630124B (zh) * | 2022-03-11 | 2024-03-22 | 商丘市第一人民医院 | 一种神经内窥镜备份方法及系统 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09154811A (ja) * | 1995-12-07 | 1997-06-17 | Asahi Optical Co Ltd | 電子内視鏡装置 |
JP2007075158A (ja) * | 2005-09-09 | 2007-03-29 | Olympus Medical Systems Corp | 画像表示装置 |
JP2011036370A (ja) | 2009-08-10 | 2011-02-24 | Tohoku Otas Kk | 医療画像記録装置 |
JP2012045419A (ja) * | 2011-11-14 | 2012-03-08 | Fujifilm Corp | 医用画像表示装置、医用画像表示システム及び医用画像表示方法、並びに内視鏡装置 |
JP2012161537A (ja) * | 2011-02-09 | 2012-08-30 | Konica Minolta Medical & Graphic Inc | 超音波診断システム、超音波診断装置及びプログラム |
JP2012170774A (ja) * | 2011-02-24 | 2012-09-10 | Fujifilm Corp | 内視鏡システム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004103167A1 (ja) * | 2003-05-22 | 2004-12-02 | Olympus Corporation | 画像記録装置 |
JP4698966B2 (ja) * | 2004-03-29 | 2011-06-08 | オリンパス株式会社 | 手技支援システム |
US20060009679A1 (en) * | 2004-07-08 | 2006-01-12 | Pentax Corporation | Electronic endoscope system capable of detecting inserted length |
JP2007058334A (ja) * | 2005-08-22 | 2007-03-08 | Olympus Corp | ファイリング装置およびファイリングシステム |
CN102802498B (zh) * | 2010-03-24 | 2015-08-19 | 奥林巴斯株式会社 | 内窥镜装置 |
-
2014
- 2014-10-30 EP EP14880937.9A patent/EP3100668A4/en not_active Withdrawn
- 2014-10-30 WO PCT/JP2014/078975 patent/WO2015114901A1/ja active Application Filing
- 2014-10-30 CN CN201480070403.1A patent/CN105848559B/zh not_active Expired - Fee Related
- 2014-10-30 JP JP2015537058A patent/JP5905168B2/ja not_active Expired - Fee Related
-
2016
- 2016-07-08 US US15/205,933 patent/US20160323514A1/en not_active Abandoned
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09154811A (ja) * | 1995-12-07 | 1997-06-17 | Asahi Optical Co Ltd | 電子内視鏡装置 |
JP2007075158A (ja) * | 2005-09-09 | 2007-03-29 | Olympus Medical Systems Corp | 画像表示装置 |
JP2011036370A (ja) | 2009-08-10 | 2011-02-24 | Tohoku Otas Kk | 医療画像記録装置 |
JP2012161537A (ja) * | 2011-02-09 | 2012-08-30 | Konica Minolta Medical & Graphic Inc | 超音波診断システム、超音波診断装置及びプログラム |
JP2012170774A (ja) * | 2011-02-24 | 2012-09-10 | Fujifilm Corp | 内視鏡システム |
JP2012045419A (ja) * | 2011-11-14 | 2012-03-08 | Fujifilm Corp | 医用画像表示装置、医用画像表示システム及び医用画像表示方法、並びに内視鏡装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3100668A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017093996A (ja) * | 2015-11-27 | 2017-06-01 | オリンパス株式会社 | 内視鏡システム及びその表示方法 |
WO2017126313A1 (ja) * | 2016-01-19 | 2017-07-27 | 株式会社ファソテック | 生体質感臓器を用いる手術トレーニング及びシミュレーションシステム |
JPWO2017126313A1 (ja) * | 2016-01-19 | 2018-11-22 | 株式会社ファソテック | 生体質感臓器を用いる手術トレーニング及びシミュレーションシステム |
JP2018007960A (ja) * | 2016-07-15 | 2018-01-18 | Hoya株式会社 | 内視鏡装置 |
WO2019049451A1 (ja) * | 2017-09-05 | 2019-03-14 | オリンパス株式会社 | ビデオプロセッサ、内視鏡システム、表示方法、及び表示プログラム |
WO2019198322A1 (ja) * | 2018-04-10 | 2019-10-17 | オリンパス株式会社 | 医療システム |
US12042339B2 (en) | 2018-04-10 | 2024-07-23 | Olympus Corporation | Medical system and method of controlling medical system |
WO2024202956A1 (ja) * | 2023-03-27 | 2024-10-03 | ソニーグループ株式会社 | 医療用データ処理装置、医療用システム |
Also Published As
Publication number | Publication date |
---|---|
CN105848559B (zh) | 2018-09-14 |
JP5905168B2 (ja) | 2016-04-20 |
EP3100668A4 (en) | 2017-11-15 |
EP3100668A1 (en) | 2016-12-07 |
CN105848559A (zh) | 2016-08-10 |
US20160323514A1 (en) | 2016-11-03 |
JPWO2015114901A1 (ja) | 2017-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5905168B2 (ja) | 医療用動画記録再生システム及び医療用動画記録再生装置 | |
JP5347089B1 (ja) | 医療情報記録装置 | |
JP5810248B2 (ja) | 内視鏡システム | |
US10426322B2 (en) | Image recording apparatus | |
US20200396411A1 (en) | Information processor, information processing method, and program | |
CN109863755B (zh) | 信号处理设备、方法和程序 | |
JP2004181229A (ja) | 遠隔手術支援システム及び支援方法 | |
US11323679B2 (en) | Multi-camera system, camera, processing method of camera, confirmation apparatus, and processing method of confirmation apparatus | |
CN109565565B (zh) | 信息处理装置、信息处理方法和非暂态计算机可读介质 | |
US11599263B2 (en) | Information processing device, method, and program for generating a proxy image from a proxy file representing a moving image | |
JPWO2018193799A1 (ja) | 情報処理装置、および情報処理方法、並びにプログラム | |
WO2011152489A1 (ja) | 画像記録システム及び画像記録方法 | |
CN108353144B (zh) | 多摄像机系统、摄像机、摄像机处理方法、确认设备和确认设备处理方法 | |
JP7264051B2 (ja) | 画像処理装置および画像処理方法 | |
US9782060B2 (en) | Medical system | |
WO2017126156A1 (ja) | 内視鏡システム | |
JP2017006260A (ja) | 内視鏡装置 | |
JP2006020875A (ja) | 内視鏡観察装置 | |
JP2017131499A (ja) | 内視鏡装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015537058 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14880937 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2014880937 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014880937 Country of ref document: EP |