CN117500429A - Medical program and medical examination system - Google Patents


Info

Publication number
CN117500429A
Authority
CN
China
Prior art keywords
video data
recording
real
still image
function
Prior art date
Legal status
Pending
Application number
CN202280042725.XA
Other languages
Chinese (zh)
Inventor
铃木隆裕
觉内笃志
Current Assignee
Kowa Co Ltd
Original Assignee
Kowa Co Ltd
Priority date
Filing date
Publication date
Application filed by Kowa Co Ltd
Priority claimed from PCT/JP2022/024128 (WO2022265064A1)
Publication of CN117500429A


Abstract

In order to live-stream and record video generated by imaging with a medical examination apparatus having an imaging camera, while preserving the usability of the apparatus, a medical program implements various processes in a terminal device used together with the medical examination apparatus. The medical program implements: a function of displaying video data, transmitted in real time while being captured by the medical examination device, on a display unit of the terminal device by live play; and a function of controlling start/stop of recording of the video data in live play in accordance with a prescribed operation.

Description

Medical program and medical examination system
Technical Field
The present invention relates to a medical program for examination of an examination subject and a medical examination system.
Background
Conventionally, medical examination apparatuses that examine an examination subject by imaging it have been in use. For example, patent document 1 discloses a medical examination device that generates imaging data by imaging an examination subject and records the generated data in a predetermined recording unit.
Prior art literature
Patent literature
Patent document 1: JP patent No. 4136559.
Disclosure of Invention
Problems to be solved by the invention
According to the medical examination apparatus described in patent document 1, a plurality of pieces of imaging data generated by imaging an examination subject can be recorded in a recording unit included in the apparatus itself. However, if the medical examination apparatus itself is made responsible for recording video, its usability may be impaired. For example, because the apparatus must then include a storage function, it becomes larger. In addition, when the apparatus cannot write identifying information into the recorded data, and recordings of a plurality of subjects are transferred from the apparatus to another terminal, there is a problem that the subjects may be confused with one another.
The present invention has been made in view of the above-described problems, and an object of the present invention is to provide a medical program and a medical examination system capable of recording video data generated by imaging with a medical examination device having an imaging camera, while maintaining the usability of the medical examination device.
Means for solving the problems
The medical program according to the present invention is a medical program for realizing various processes in a terminal device used together with a medical examination device having an imaging camera, and is characterized in that the medical program realizes: a receiving function of receiving various data including video data (hereinafter referred to as real-time video data) transmitted in real time while being captured by the medical examination device; a display control function of displaying a video represented by the received real-time video data by live play on a display unit included in the terminal device; a recording function of controlling start/stop of recording of the real-time video data in live play in accordance with a prescribed operation performed on the terminal device; and a recording control function of recording, as recorded video data in a recording unit provided in the terminal device, the real-time video data from the start of recording by the recording function until recording is stopped.
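As an illustration only (not the claimed implementation), the recording and recording-control functions described above can be sketched as a small controller that buffers live frames only while recording is active; all class, method, and variable names here are hypothetical:

```python
class RecordingController:
    """Sketch of the recording function: frames are stored only while recording."""

    def __init__(self):
        self.recording = False      # start/stop is managed by a flag
        self._buffer = []           # frames of the clip currently being recorded
        self.recorded_clips = []    # stands in for the terminal-side recording unit

    def toggle_record(self):
        """Called on the prescribed operation (e.g. an on-screen record button)."""
        if self.recording:
            # stop: move the buffered frames to storage as recorded video data
            self.recorded_clips.append(self._buffer)
            self._buffer = []
            self.recording = False
        else:
            self.recording = True

    def on_frame(self, frame):
        """Called for each frame of real-time video data as it is received."""
        if self.recording:
            self._buffer.append(frame)
        return frame  # the frame is passed to the live display either way
```

Frames received before recording starts are displayed but discarded; only the span between start and stop becomes recorded video data.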
In the medical program according to the present invention, the following function may be realized in the display control function: causing a first button for instructing start/stop of recording of the real-time video data in live play to be displayed on the display section, which is a touch screen provided in the terminal device; and the following function may be realized in the recording function: controlling start/stop of recording of the real-time video data in live play in accordance with an input operation on the first button.
In the medical program according to the present invention, the following function may be realized in the receiving function: receiving, from the medical examination device, information indicating a control request to start/stop recording of the real-time video data, based on an operation performed on a predetermined operation unit physically provided on the medical examination device; and the following function may be realized in the recording function: controlling start/stop of recording of the real-time video data in live play in accordance with the information indicating the control request received from the medical examination device.
In the medical program according to the present invention, the display control function may be realized by: when the recorded video data is played and displayed on the display unit, a second button for receiving an input operation for extracting still image data from the recorded video data is displayed on the display unit, and the following functions are realized in the recording control function: when the second button is input, still image data corresponding to the still image displayed on the display unit is recorded in the recording unit.
In the medical program according to the present invention, a reception function of receiving an input operation of identification information for identifying the examination subject may be realized, and the following function may be realized in the recording control function: recording the real-time video data in the recording unit as the recorded video data only when the input operation of the identification information has been received.
In the medical program according to the present invention, the recording control function may be realized by: and recording the real-time video data as the recorded video data in the recording unit in association with the identification information whose input is accepted.
In the medical program according to the present invention, a function of receiving an input operation of identification information for identifying the examination subject may be realized, and the following function may be realized in the display control function: displaying the video represented by the received real-time video data by live play on the display unit only when the input operation of the identification information has been received.
In the medical program according to the present invention, the display control function may be realized by: when displaying the video represented by the real-time video data, the identification information representing the inspection object is displayed on the display unit simultaneously with the display of the video.
In the medical program according to the present invention, the medical examination device may be a device for examining an eye to be examined, and the display control function may be implemented by: a third button for selecting which of left and right eyes of the subject is the subject to be inspected is displayed on the display unit, and the following functions are implemented in the recording control function: and recording the real-time video data in the recording unit as the recorded video data in correspondence with the left and right selection information of the eye to be inspected, in accordance with the selection operation performed on the third button.
In the medical program according to the present invention, the display control function may be realized by: when displaying a video represented by the recorded video data, the left and right selection information is displayed on the display unit in association with the video.
The medical program according to the present invention may be characterized by including: a transmission function of transmitting the recorded video data to a predetermined server apparatus; and a communication control function that performs control so that, of communication in which the reception of the real-time video data is performed by the reception function and communication in which the transmission of the recorded video data is performed by the transmission function, communication with one party is prohibited while communication with the other party is in progress.
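The mutual exclusion between receiving real-time video data and transmitting recorded video data, described in the communication control function above, might be sketched as a simple guard object; this is an illustrative sketch only, and all names are hypothetical:

```python
class CommGuard:
    """Sketch: forbid receiving and uploading from being in progress together."""

    def __init__(self):
        self._active = None  # None, "receive", or "upload"

    def try_start(self, mode):
        """Claim the channel for `mode`; refuse while the other mode runs."""
        assert mode in ("receive", "upload")
        if self._active is None or self._active == mode:
            self._active = mode
            return True
        return False  # communication with the other party is in progress

    def stop(self):
        """Release the channel when the current communication finishes."""
        self._active = None
```

A real implementation would hook this check into the network layer; the point illustrated is only that one direction is prohibited while the other is active.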
The medical examination system according to the present invention is a medical examination system including a medical examination device provided with an imaging camera, a terminal device, and a server device, wherein the medical examination device transmits video data generated by imaging an examination subject to the terminal device in real time, and the terminal device includes: a receiving unit that receives various data including video data (hereinafter referred to as real-time video data) transmitted in real time while being captured by the medical examination device; a display control unit that displays a video represented by the received real-time video data by live play on a display unit included in the terminal device; a recording unit that controls start/stop of recording of the real-time video data in live play in accordance with a predetermined operation performed on the terminal device; a recording control unit that records, as recorded video data in a recording means provided in the terminal device, the real-time video data from the start to the stop of recording by the recording unit; and a transmitting unit that transmits the recorded video data to the server device.
In the medical examination system according to the present invention, the server device may classify and manage the recorded video data received from the terminal device based on identification information associated with the terminal device as information for identifying the examination object.
In the medical examination system according to the present invention, the medical examination device may function as one access point, and may not be connected to other devices when a communication connection is established with the terminal device.
An image processing program according to the present invention is an image processing program for causing a computer to realize functions related to image processing, the program causing the computer to realize: a video data acquisition function of acquiring video data to be processed; a focusing degree calculation function of calculating a focusing degree (focus level) of frames constituting a part or all of the video data; a still image extraction function of extracting, as extracted still image data, a part or all of the frames (candidate frames) whose calculated focusing degree is equal to or higher than a predetermined value; and a recording function of causing the extracted still image data to be recorded in a recording unit provided in the computer.
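As a rough illustration of the candidate-selection step (not the claimed implementation), frames whose focusing degree meets the predetermined value could be picked by simple thresholding; the function name and parameters are hypothetical:

```python
def select_candidate_frames(focus_levels, threshold):
    """Return indices of frames whose focusing degree is at or above `threshold`.

    focus_levels: one focusing-degree score per examined frame, in time order.
    The returned indices are the "candidate frames" from which extracted
    still image data would then be taken.
    """
    return [i for i, level in enumerate(focus_levels) if level >= threshold]
```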
In the image processing program according to the present invention, in the still image extraction function, when a plurality of the candidate frames exist within a predetermined time range when the candidate frames are arranged in time series, a predetermined number of those candidate frames may be excluded from the extraction targets, starting from the candidate frame with the lowest focusing degree.
In the image processing program according to the present invention, in the still image extraction function, when the candidate frames are examined one by one in time order in extracting the extracted still image data, a candidate frame whose focusing degree is lower than that of the frame currently under examination may be excluded from the extraction targets if it lies within a predetermined time range of the time point of that frame.
In the image processing program according to the present invention, in the still image extraction function, when extracting the extracted still image data from the candidate frames, candidate frames may be excluded from the extraction targets in ascending order of focusing degree so that the total number of pieces of extracted still image data does not exceed a predetermined upper limit.
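The pruning rules above (suppressing weaker nearby candidates within a time window, then capping the total count by dropping the lowest focusing degrees) might be combined as follows; this is one possible reading of the claims, with hypothetical names, not the claimed algorithm itself:

```python
def prune_candidates(candidates, window, max_count):
    """Thin out candidate frames before extraction.

    candidates: list of (time, focus) tuples in time order.
    window:     predetermined time range within which only the best-focused
                frame of a cluster is kept.
    max_count:  predetermined upper limit on extracted still images.
    """
    kept = []
    for t, f in candidates:
        if kept and t - kept[-1][0] < window:
            if f > kept[-1][1]:
                kept[-1] = (t, f)   # this frame beats the nearby weaker one
            continue                 # weaker frame within the window: excluded
        kept.append((t, f))
    # enforce the upper limit by removing the smallest focusing degrees
    while len(kept) > max_count:
        kept.remove(min(kept, key=lambda tf: tf[1]))
    return kept
```

Note that this sketch anchors the window on the most recently kept frame; a production version might instead compare every pair within the window.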
In the image processing program according to the present invention, the calculation of the degree of focusing in the degree of focusing calculation function may be performed by an edge extraction image generation process of generating an edge extraction image by performing an edge extraction process on the frame, and a calculation process of calculating the degree of focusing of the frame corresponding to the edge extraction image based on a distribution of edges in the edge extraction image.
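The edge-based calculation might be sketched as below. The patent requires only that the focusing degree be derived from the distribution of edges in an edge-extraction image; the mean gradient magnitude used here is one simple stand-in for that statistic, and all names are hypothetical:

```python
def focus_level(gray):
    """Illustrative focusing-degree measure: mean absolute pixel gradient.

    gray: 2-D list of pixel intensities. A sharper (better focused) frame
    has stronger edges, so its mean gradient magnitude is larger.
    """
    h, w = len(gray), len(gray[0])
    total, count = 0, 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:   # horizontal neighbour difference
                total += abs(gray[y][x + 1] - gray[y][x]); count += 1
            if y + 1 < h:   # vertical neighbour difference
                total += abs(gray[y + 1][x] - gray[y][x]); count += 1
    return total / count
```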
In the image processing program according to the present invention, the focusing degree calculation function may perform a green component image generation process of extracting the green component from the color information of each pixel constituting the frame to generate green component image data, and a grayscaling process of generating grayscale image data from the green component image data, and the focusing degree may be calculated from the grayscale image data obtained by the grayscaling process.
In the image processing program according to the present invention, the focusing degree calculation function may perform an adjustment process of adjusting the black level of the grayscale image data, and the focusing degree may be calculated from the grayscale image data whose black level has been adjusted by the adjustment process.
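The green-component grayscaling and black-level adjustment steps might look like the following minimal sketch; the clamp-to-zero interpretation of black-level adjustment and all parameter names are assumptions, not taken from the patent:

```python
def green_grayscale(frame_rgb, black_level=0):
    """Build a grayscale image from the green component, then lift the black level.

    frame_rgb: 2-D list of (r, g, b) tuples. Values below `black_level` are
    clamped to zero, suppressing dark-noise pixels before the focusing
    degree is computed.
    """
    return [[max(0, px[1] - black_level) for px in row] for row in frame_rgb]
```

The green channel is a common choice for fundus and anterior-segment imagery because it tends to carry the most contrast, which is presumably why the patent singles it out.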
In the image processing program according to the present invention, the focusing degree calculation function may set one frame as the target frame for calculating the focusing degree at predetermined time intervals or at predetermined frame intervals among frames constituting the video data.
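Selecting one target frame per predetermined frame interval can be sketched trivially; per-time-interval selection would work the same way after converting the interval with the frame rate. Names are illustrative:

```python
def target_frame_indices(total_frames, frame_interval):
    """Pick one focusing-degree target frame every `frame_interval` frames."""
    return list(range(0, total_frames, frame_interval))
```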
In the image processing program according to the present invention, the video data may be data obtained by imaging an eye of the subject, and the focusing degree calculation function may calculate the focusing degree for a region of the frame from which regions of a predetermined range at the top and/or bottom have been removed.
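Removing the predetermined top/bottom ranges before computing the focusing degree might be sketched as a row crop; the row-count parameters are hypothetical (the patent says only "a predetermined range"):

```python
def crop_margins(gray, top, bottom):
    """Drop `top` rows from the top and `bottom` rows from the bottom of a frame.

    For eye images, eyelid and eyelash regions near the upper and lower edges
    can distort the focusing degree, so they are removed before measurement.
    """
    return gray[top:len(gray) - bottom]
```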
In the image processing program according to the present invention, the video data acquisition function may be configured to store the acquired video data in a storage unit included in the computer, and the computer may be configured to: and a deleting function of deleting the video data stored in the storage unit at a predetermined timing after the still image data is extracted by the still image extracting function.
In addition, the image processing program according to the present invention may be configured to cause the computer to: a display control function of causing a display unit of the computer to display an extracted still image represented by the extracted still image data; a receiving function for receiving a selection operation for the displayed extracted still image from a user; the following functions are implemented in the recording function: the extracted still image data representing the extracted still image that has accepted the selection operation is recorded in the recording unit.
In the image processing program according to the present invention, the display control function may be realized by: when a predetermined first input operation is performed, a video represented by video data including the extracted still image is displayed on the display unit, and the following functions are realized in the reception function: accepting a selection operation of the frame of the video displayed, the following functions being implemented in the recording function: still image data representing the frame that has accepted the selection operation is recorded in the recording unit.
In the image processing program according to the present invention, in the display control function, when a predetermined second input operation is performed on any one of the extracted still images represented by the extracted still image data, playback of the video data may continue from the position of that extracted still image.
Advantageous Effects of Invention
According to the present invention, it is possible to provide a medical program and a medical examination system capable of recording video data generated by imaging with a medical examination device having an imaging camera, while maintaining the usability of the medical examination device.
Further, according to the present invention, it is possible to provide an image processing program capable of automatically extracting and recording still image data from video data.
Drawings
Fig. 1 is an explanatory diagram illustrating an example of a configuration of a medical examination system 100 according to at least one embodiment of the present invention.
Fig. 2 is a block diagram showing an example of the functional configuration of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 3 is a flowchart showing an example of a flow of processing performed by the medical examination apparatus 10 according to at least one embodiment of the present invention.
Fig. 4 is a flowchart showing an example of a flow of processing performed by the terminal device 20 according to at least one embodiment of the present invention.
Fig. 5 is an explanatory diagram showing an example of a display screen of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 6 is an explanatory diagram showing an example of a display screen of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 7 is an explanatory diagram showing an example of a display screen of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 8 is an explanatory diagram showing an example of a display screen of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 9 is an explanatory diagram showing an example of a display screen of the terminal device 20 according to at least one embodiment of the present invention.
Fig. 10 is a block diagram showing an example of the functional configuration of the image processing apparatus 10-1 according to at least one embodiment of the present invention.
Fig. 11 is an explanatory diagram for explaining an example of an environment to which the image processing apparatus 10-1 according to at least one embodiment of the present invention is applied.
Fig. 12 is a flowchart showing an example of a flow of frame extraction processing performed by the image processing apparatus 10-1 according to at least one embodiment of the present invention.
Fig. 13 is a flowchart showing an example of a flow of the frame focusing degree calculation processing performed by the image processing apparatus 10-1 according to at least one embodiment of the present invention.
Fig. 14 is an explanatory diagram for explaining an example of management of video data and still image data in the image processing apparatus 10-1 according to at least one embodiment of the present invention.
Fig. 15 is an explanatory diagram for explaining an extraction processing example of extracting still image data from candidate frames in the image processing apparatus 10-1 according to at least one embodiment of the present invention.
Detailed Description
First embodiment
Hereinafter, an example of the medical examination system 100 according to the embodiment of the present invention will be described with reference to the drawings. Fig. 1 is an explanatory diagram illustrating an example of a configuration of a medical examination system 100 according to at least one embodiment of the present invention. As shown in fig. 1, a medical examination system 100 according to an embodiment of the present invention includes a medical examination apparatus 110, a terminal apparatus 120, and a server apparatus 130.
The medical examination apparatus 110 is an apparatus for performing an examination on an examination subject. The examination object is, for example, a human or an animal. The medical examination device 110 has at least a camera for photographing and a function of photographing an examination subject. The medical examination device 110 has a communication function, and in this example, performs wireless communication with the terminal device 120. In the case where the medical examination apparatus 110 and the terminal apparatus 120 perform wired communication, the medical examination apparatus 110 and the terminal apparatus 120 are connected by, for example, a USB (universal serial bus (Universal Serial Bus)) cable or the like.
The medical examination apparatus 110 images the examination subject and simultaneously transmits the video data to the terminal device 120 in real time. Here, transmitting video data in real time while capturing means that the captured video data is transmitted to the terminal device 120 as it is captured, without a time lag. Hereinafter, video data transmitted in real time during capture is referred to as "real-time video data".
In this example, the medical examination apparatus 110 is a hand-held apparatus called a so-called slit lamp, which irradiates an eye of an object to be examined (hereinafter referred to as an "eye to be examined") with slit light, and inspects a cornea, a crystalline lens, and the like of the eye to be examined by observing scattered light generated by scattering of the eye to be examined. The medical examination device 110 is not limited to the above example, and may be a handheld medical examination device such as a handheld fundus camera or a stationary medical device such as a dry eye examination device, in addition to a handheld slit lamp.
The medical examination device 110 may be configured to function as a single wireless LAN (Local Area Network) access point, and may limit to one the number of other devices that can be communicatively connected at the same time. For example, while a communication connection is established with the terminal device 120, the medical examination device 110 cannot be communicatively connected to any other device. By controlling communication in this way, an unrelated device is prevented from receiving the video data captured during an examination and displaying the video it represents, and as a result personal information is protected. With this communication control method, the medical examination device 110 also does not need to screen the identity of its communication partner: unlike a method that realizes one-to-one communication by attaching a shared ID to the signals exchanged between devices, it simply communicates with whichever device first establishes a connection (e.g., a WiFi (registered trademark) connection) with it.
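The first-come, single-client connection policy described above might be sketched as follows; this is an illustrative model of the behavior, not the device's actual network stack, and all names are hypothetical:

```python
class SingleClientAccessPoint:
    """Sketch: accept only the first device that connects; refuse all others."""

    def __init__(self):
        self.client = None  # identifier of the currently connected device

    def connect(self, device_id):
        """Succeed only if no other device holds the connection."""
        if self.client is None:
            self.client = device_id
            return True
        return self.client == device_id  # the connected device may re-attach

    def disconnect(self, device_id):
        """Free the slot when the connected device leaves."""
        if self.client == device_id:
            self.client = None
```

Because only one slot exists, no ID-matching between signals is needed: whichever terminal connects first is the sole communication partner until it disconnects.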
The terminal device 120 is a device used together with the medical examination device 110. In this example, the terminal device 120 is a tablet terminal with a touch screen.
The terminal device 120 may be designed as a dedicated device, but may also be implemented by a general-purpose computer. That is, the terminal device 120 includes at least a CPU (Central Processing Unit) and a memory, as commonly provided in a general computer. The processing of the terminal device 120 is realized by reading the program for that processing from memory and executing it on a CPU or GPU (Graphics Processing Unit) functioning as a control circuit (processing circuitry). In other words, by executing this program, a processor (processing circuit) carries out each process of each device. In the following, the terminal device 120 may be any device capable of computer-equivalent processing, and may be implemented by, for example, a smartphone or a tablet terminal. As described above, the terminal device 120 is a tablet terminal in this example.
The terminal device 120 has a function of communicating with the medical examination device 110 and the server device 130. Specifically, the terminal device 120 performs wireless communication with the medical examination device 110, performs wired or wireless communication with the server device 130, and performs transmission and reception of various information. More specifically, the terminal device 120 receives real-time video data transmitted from the medical examination device 110, and records the data in a recording unit (hereinafter referred to as a terminal-side recording unit) included in the device. Then, the terminal device 120 transmits the recorded real-time video data (hereinafter also referred to as recorded video data) to the server device 130. In the case where the terminal device 120 and the server device 130 perform wired communication, the terminal device 120 and the server device 130 are connected by, for example, a USB cable or the like.
The terminal device 120 further includes a display unit (hereinafter referred to as a terminal display unit) and displays various information on the terminal display unit. Specifically, the terminal device 120 displays a video represented by real-time video data received from the medical examination device 110 or a video represented by recorded video data recorded in the terminal-side recording means on the terminal display unit. In this example, the terminal device 120 as a tablet terminal has a touch panel functioning as a terminal display unit, displays various information including a video represented by real-time video data or a video represented by recorded video data on the touch panel, and receives various input operations from a user by a touch operation performed on the touch panel.
The server device 130 is a device for managing various information such as video data transmitted from the terminal device 120. The server device 130 has a recording unit that records the recorded video data and still image data received from the terminal device 120. The recorded video data recorded in this recording unit may be classified and managed based on the identification information associated with it by the terminal device 120 (added when the terminal device 120 performs recording). In this example, the server device 130 uses the identification information received from the terminal device 120 to establish the correspondence, and records the recorded video data in its recording unit in association with the electronic medical record of the corresponding subject. The recording means for recording the recorded video data transmitted from the terminal device 120 is not particularly limited as long as it is different from the terminal-side recording unit; another example is a recording unit belonging to the same network as the server device 130.
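The server-side classification by identification information might be sketched as a simple grouping step; the pair layout and function name are hypothetical, used only to illustrate the idea:

```python
def classify_by_subject(recordings):
    """Group recorded-video entries by the subject identification information.

    recordings: iterable of (subject_id, filename) pairs as they might
    arrive from the terminal device. Returns a mapping from each subject's
    identification information to that subject's recordings, preserving
    arrival order within each group.
    """
    index = {}
    for subject_id, filename in recordings:
        index.setdefault(subject_id, []).append(filename)
    return index
```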
The communication between the server apparatus 130 and the terminal apparatus 120 may be performed by wired connection such as a USB cable, or may be performed by wireless using a wireless LAN or the like. The server device 130 is a device for managing electronic medical records and the like provided in an in-hospital LAN of a hospital, for example.
Fig. 2 is a block diagram showing an example of the functional configuration of the terminal device 120 according to at least one embodiment of the present invention.
As shown in fig. 2, the terminal device 120 includes a receiving unit 121, a display control unit 122, a recording unit 123, a recording control unit 124, a receiving unit 125, a transmitting unit 126, and a communication control unit 127. First, an outline of a functional configuration example of the terminal device 120 will be described.
The receiving section 121 has the following functions: various data including video data (real-time video data) transmitted in real time while being photographed by the medical examination apparatus 110 is received.
The receiving unit 121 may receive, from the medical examination device 110, information indicating a control request to start/stop recording of the real-time video data to the terminal-side recording unit, based on an operation performed on a predetermined operation unit physically provided on the medical examination device 110. Here, the operation unit is, for example, a button or a dial switch. The information indicating the control request to start or stop recording is not particularly limited, but is preferably information whose content is shared in advance between the medical examination device 110 and the terminal device 120.
The display control unit 122 has a function of displaying a video represented by the received real-time video data on the terminal display unit by live broadcasting. Specifically, the display control unit 122 has a function of displaying the video indicated by the real-time video data on the terminal display unit immediately after the real-time video data is received by the receiving unit 121. By performing live-action display in this way, it is possible to perform the photographing operation while checking the status of photographing performed by the medical examination apparatus 110.
The recording section 123 has the following functions: the start/stop of recording of real-time video data in live play is controlled in accordance with a predetermined operation performed on the terminal device 120. Here, controlling the start/stop of the recording of the real-time video data during live play means starting the recording of the real-time video data in the terminal-side recording unit in accordance with a predetermined operation as a trigger for starting the recording, and stopping the recording of the real-time video data in the terminal-side recording unit in accordance with a predetermined operation as a trigger for ending the recording. Examples of the predetermined operation include an input operation performed on a recording start button and a stop button displayed on the terminal display unit, and an operation performed on an operation unit that functions as a recording start button and a stop button physically provided on the terminal device 120. The start/stop of recording is managed by a predetermined flag, for example. The recording unit 123 may control the start/stop of recording of the real-time video data during live broadcasting based on the information indicating the control request received by the receiving unit 121 from the medical examination device 110.
The recording control unit 124 has a function of recording, in the terminal-side recording means, the real-time video data from the point at which the recording unit 123 starts recording until it stops, as recorded video data. The manner in which the real-time video data is recorded as recorded video data is not particularly limited, but it is preferably recorded in association with predetermined information. Examples of the predetermined information include identification information that uniquely identifies the examination subject and the photographing time of the medical examination device 110. The identification information is not particularly limited as long as it can identify the examination subject.
The reception unit 125 has a function of accepting an input operation of the identification information that identifies the examination subject. The identification information acquired by the reception unit 125 is recorded in association with the recorded video data recorded by the recording control unit 124. Besides input via soft keys displayed on the terminal display unit, the input operation may be performed by hard keys (physical keys) when the terminal device 120 is provided with them, by voice when the terminal device 120 has a microphone function, or by reading a one-dimensional or two-dimensional barcode representing the identification information when the terminal device 120 has a barcode reading function or a camera function. When the terminal device 120 has a camera function, the identification information recorded on a card photographed by the terminal device 120 may also be read by OCR (Optical Character Recognition). Of course, these methods may be used in combination. The display control unit 122 displays the video indicated by the received real-time video data by live play on the terminal display unit only when the input operation of the identification information has been accepted. The display control unit 122 may also display the video indicated by the real-time video data on the terminal display unit together with the identification information. Likewise, the recording control unit 124 may record the real-time video data as recorded video data in the terminal-side recording means only when the input operation of the identification information has been accepted.
Here, the recording control unit 124 may record the real-time video data as recorded video data in the terminal-side recording means in association with the identification information accepted as input. Such a configuration facilitates the management of the recorded video data.
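One way to picture this association is a record that bundles the video frames with the subject's identification information and the photographing time. The function and field names below are assumptions for illustration; the patent does not fix any particular data layout.

```python
from datetime import datetime

def make_record(video_frames, subject_id, captured_at=None):
    """Sketch of recording video data in association with the predetermined
    information described above: the subject's identification information
    and the photographing time. Field names are illustrative assumptions."""
    return {
        "subject_id": subject_id,   # uniquely identifies the examination subject
        "captured_at": captured_at or datetime.now().isoformat(),
        "frames": list(video_frames),
    }
```

Storing the identification information inside each record is what lets the terminal later group recordings into per-subject folders, as the display screens described below do.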
The transmitting unit 126 has a function of transmitting the recorded video data to a predetermined server device, which corresponds to the server device 130 in fig. 1. The recorded video data to be transmitted to the server device 130 is determined, for example, by a selection operation of the user. Hereinafter, the recorded video data transmitted to the server device 130 is referred to as "server management video data". The server device 130 records the received server management video data in its own recording unit or in a recording unit of another device belonging to the same network.
The communication control unit 127 has a function of controlling the communication so that, between the communication in which the receiving unit 121 receives the real-time video data and the communication in which the transmitting unit 126 transmits the recorded video data, one is prohibited while the other is in progress. For example, the communication control unit 127 prohibits the transmitting unit 126 from transmitting the recorded video data while the receiving unit 121 is receiving the real-time video data. The reception of the real-time video data by the receiving unit 121 may be regarded as continuing until the medical examination device 110 stops transmitting the real-time video data and the reception thereby stops.
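The mutual-exclusion rule above can be sketched as a small state machine. The class and state names are assumptions for illustration; the point is only that each direction of communication refuses to start while the other is in progress.

```python
class CommunicationControl:
    """Sketch of the communication control unit 127: live reception and
    recorded-data transmission are mutually exclusive."""

    IDLE, RECEIVING, TRANSMITTING = "idle", "receiving", "transmitting"

    def __init__(self):
        self.state = self.IDLE

    def start_receiving(self):
        """Begin receiving real-time video data; refused during an upload."""
        if self.state == self.TRANSMITTING:
            return False
        self.state = self.RECEIVING
        return True

    def start_transmitting(self):
        """Begin uploading recorded video data; refused during live reception."""
        if self.state == self.RECEIVING:
            return False
        self.state = self.TRANSMITTING
        return True

    def stop(self):
        """The current communication has ended (e.g. the device stopped
        transmitting real-time video data)."""
        self.state = self.IDLE
```

Under this rule, an upload attempted mid-examination simply fails until the device-side transmission of real-time video data has stopped.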
Hereinafter, the control by the display control unit 122 of the GUI (Graphical User Interface) displayed on the terminal display unit of the terminal device 120 will be described.
The display control unit 122 may cause the terminal display unit, which is a touch panel included in the terminal device 120, to display a first button for instructing the start/stop of recording of the real-time video data during live play to the terminal-side recording means. The recording unit 123 may then control the start/stop of recording of the real-time video data during live play in accordance with an input operation on the first button. For example, when the first button displayed on the touch panel is touched, processing that controls the start/stop of the recording may be performed. The display form of the first button is not particularly limited, but is preferably one from which the user can recognize that it relates to the start/stop control of recording.
When recorded video data is played and displayed on the terminal display unit, the display control unit 122 may display a second button for accepting an input operation requesting the extraction of still image data from the recorded video data. The input operation here is, for example, a touch operation. The display form of the second button is not particularly limited, but is preferably one from which the user can recognize that it relates to a still-image extraction request. When the second button is operated, the recording control unit 124 may record still image data corresponding to the still image displayed on the terminal display unit in the terminal-side recording means. With such a configuration, when a still image is required, it can easily be extracted from the video. The still image data recorded in the terminal-side recording means may be transmitted to a predetermined server device (for example, the server device 130) by the transmitting unit 126.
When the medical examination device 110 is a device for examining an eye, the display control unit 122 may display on the terminal display unit a third button for selecting which of the examinee's left and right eyes is the examination target. The display form of the third button is not particularly limited, but is preferably one from which the user can recognize that it relates to the selection between the left and right eyes.
When recording the real-time video data as recorded video data, the recording control unit 124 may record, in the terminal-side recording means, left/right selection information of the examined eye determined by a selection operation on the third button, in association with the recorded video data. With such a configuration, it is not necessary to determine afterward which eye was examined, which facilitates the management of the data.
Further, when displaying the video represented by the recorded video data, the display control unit 122 may cause the terminal display unit to display the left/right selection information in association with the video. Examples of the left/right selection information include text information or an icon indicating left or right. With such a configuration, the user can confirm which of the left and right eyes is currently selected, and can easily verify whether the left/right selection information is correct.
The display control unit 122 may perform processing for changing the play range or the play method of the video indicated by the displayed recorded video data, such as enlarged play, in accordance with a predetermined operation during playback. The predetermined operation here means an operation of a predetermined type performed on a predetermined portion of the terminal display unit, for example a touch, drag, zoom-in, zoom-out, long press, or slide. Examples of processing for changing the play range or play method include changing the display range of a still image in a certain frame of the video, and adjusting the display time within the video (adjusting which of a plurality of frames in the video is displayed). For example, when playing the recorded video data and displaying the video on the terminal display unit, the display control unit 122 may enlarge or reduce the display range in accordance with a predetermined operation. The recording control unit 124 may record the video displayed with its display range changed by the display control unit 122. In addition, when processing that changes the play range or play method has been performed, the recording control unit 124 may record, in the terminal-side recording means, still image data corresponding to the still image displayed on the terminal display unit within the video when the second button is operated. For example, when the second button is operated while a video with a changed display range is being displayed, the recording control unit 124 may record still image data corresponding to the still image of the display range shown on the terminal display unit.
That is, when the second button is operated while the video represented by the recorded video data is displayed enlarged, the enlarged still image displayed at that moment is recorded in the terminal-side recording means.
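Extracting the still image of the currently displayed range can be sketched as a simple crop of the frame. Representing a frame as a 2D grid of pixels and the display range as a `(top, left, height, width)` tuple are assumptions made for illustration.

```python
def extract_displayed_still(frame, display_range):
    """Sketch of the still-image extraction described above: when the video
    is shown enlarged or panned, only the currently displayed range of the
    frame is saved as still image data.

    frame: 2D list of pixel values (rows of columns).
    display_range: (top, left, height, width) of the visible region.
    """
    top, left, h, w = display_range
    return [row[left:left + w] for row in frame[top:top + h]]
```

With the full frame passed as the display range, the extraction degenerates to saving the whole still image, matching the non-enlarged case.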
In addition, the terminal device 120 may be capable of executing editing processing of the recorded video data. For example, the display control unit 122 may display consecutive frames of video data in chronological order, so that unnecessary frames can be easily deleted.
Fig. 3 is a flowchart showing an example of the flow of the operation of the medical examination apparatus 110 according to at least one embodiment of the present invention.
In fig. 3, the operation of the medical examination device 110 begins when photographing of the examination subject is started (step S101). Next, the medical examination device 110 generates video data by photographing the examination subject (step S102); this generation processing is executed in parallel with the operation of step S101. The medical examination device 110 then starts transmitting the generated video data to the terminal device 120 in real time (step S103). In this example, the transmission of the video data by the medical examination device 110 continues after step S103.
Fig. 4 is a flowchart showing an example of a flow of processing of the terminal device 120 according to at least one embodiment of the present invention.
In fig. 4, the operation of the terminal device 120 begins when, after a communication connection with the medical examination device 110 has been established, an input operation is performed on a photographing start button displayed on the terminal display unit (step S151). Next, the terminal device 120 accepts an input operation of the identification information (step S152); in this example, the input is performed via soft keys displayed on the terminal display unit. When the input operation of the identification information has been accepted in step S152, the terminal device 120 starts receiving various data, including the real-time video data captured by the medical examination device 110 (step S153). The terminal device 120 then starts displaying the video represented by the real-time video data by live play on the terminal display unit (step S154). Although this flowchart describes a configuration in which live play of the real-time video data starts after the identification information is input, the configuration is not necessarily limited to this; live play may instead start at the point when the communication connection between the medical examination device 110 and the terminal device 120 is established.
Next, the terminal device 120 causes the terminal display unit, which is a touch panel, to display a first button for instructing the start/stop of recording of the real-time video data during live play to the terminal-side recording means (step S155). In this example, the terminal device 120 displays a recording control button as the first button below the video indicated by the real-time video data during live play on the display screen of the terminal display unit. The recording control button here is a switching button that serves both as a button for starting recording and as a button for stopping it.
Next, when the terminal device 120 receives an operation to end live play of the video indicated by the real-time video data (yes in step S156), it ends the live play (step S157) and terminates its operation. In this example, live play of the video represented by the real-time video data ends when a touch operation is received on a button, displayed on the touch panel of the terminal device 120, for returning from the live-play screen to the start screen.
On the other hand, when the terminal device 120 does not receive an operation to end live play of the video indicated by the real-time video data (no in step S156), and when the terminal device 120 does not receive an input operation to the first button (no in step S158), the process returns to the determination in step S156.
On the other hand, when the terminal device 120 does not receive an operation to end live play of the video indicated by the real-time video data (no in step S156) but does receive an input operation on the first button (yes in step S158), the terminal device 120 controls the start/stop of recording of the real-time video data during live play (step S159). In this example, when a touch operation is received on the recording control button, recording is started if it has not yet started, and stopped if it is in progress. Accordingly, after live play of the video represented by the real-time video data has started, the first input operation on the first button starts the recording of the video data.
Next, when recording of the real-time video data during live broadcasting is started in step S159 (yes in step S160), the terminal device 120 starts a recording process of recording the real-time video data as recorded video data in the terminal-side recording means (step S161), and returns to the determination in step S156.
On the other hand, when the recording of the real-time video data in live broadcasting is stopped in step S159 (no in step S160), the terminal device 120 stops the recording process of recording the real-time video data as recorded video data in the terminal-side recording means (step S162), and returns to the determination in step S156.
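The branch structure of steps S156 to S162 above can be sketched as an event handler. The event names and the state dictionary are assumptions made for illustration; they stand in for the touch operations and the recording flag described in the flowchart.

```python
def handle_event(state, event):
    """Sketch of the loop in steps S156-S162: an 'end' event ends live play
    (S157); a 'record_button' event toggles recording (S158/S159), leading
    to the start (S161) or stop (S162) of the recording process; any other
    event returns to the determination of step S156."""
    if event == "end":              # S156: yes
        state["live"] = False       # S157: live play ends
        return "stopped"
    if event == "record_button":    # S158: yes
        state["recording"] = not state["recording"]  # S159
        # S160/S161/S162: branch on the new recording state
        return "recording_started" if state["recording"] else "recording_stopped"
    return "playing"                # S156: no / S158: no
```

Each call corresponds to one pass through the loop; the first `record_button` event after live play begins always starts recording, matching the flowchart's description.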
In the above, an example of the flow of the processing performed by the medical examination apparatus 110 and the terminal apparatus 120 is described.
Fig. 5 to 9 are explanatory views showing examples of display screens of the terminal device 120 according to at least one embodiment of the present invention.
Fig. 5 is a diagram showing an example of a display screen of the terminal device 120 when the medical application is started on the terminal device 120. Fig. 5 shows a display screen A1 as the start screen of the medical application. On the display screen A1, which is the screen immediately after the medical application is started, a list of identification information of the subjects whose recorded video data is recorded in the terminal-side recording means of the terminal device 120 is displayed.
In the present application, the video data of the subject captured by the imaging camera included in the medical examination device 110 can be recorded as recorded video data, which is managed in folders divided by each piece of management subject identification information B1. On the display screen A1, examination time information (examination start time information) C and an imaging start button D1 are displayed. The examination time information C indicates the examination time corresponding to each piece of management subject identification information B1. The imaging start button D1 is a button for displaying the video represented by the real-time video data by live play on the terminal display unit. In addition, the recorded video data or still image data sorted into at least one folder may be transmitted to the server device 130 in accordance with a selection operation performed on the display screen A1.
Fig. 6 is a diagram showing an example of a display screen of the terminal device 120 when an input operation on the imaging start button D1 is received on the display screen A1. Fig. 6 (A) and 6 (B) show a display screen A2 and a display screen A3 as video display preparation screens of the medical application. On the display screen A2, a message F1 prompting the input of identification information, an input form E for the identification information, and an input completion button D2 for the identification information are superimposed and displayed. The identification information is entered into the input form E using the soft keys displayed on the display screen A1.
On the display screen A3, the input identification information B2, a confirmation message F2 for the input identification information, a button D3 for re-entering the identification information, and a button D4 for starting shooting upon confirming that the input identification information has no problem are superimposed and displayed. The input identification information B2 is the information entered in the input form E, and is the same kind of information as the management subject identification information B1 under which the video data is managed. When the button D4 is touched, live play of the video indicated by the real-time video data starts. In fig. 6, the input of identification information is required in order to identify the subject whose eye is being photographed; however, when the need for management based on identification information is relatively low, for example when the examination subject is an animal, the input of identification information as in fig. 6 is not necessarily required.
Fig. 7 is a diagram showing an example of a display screen of the terminal device 120 when an input operation on the button D4 for starting shooting is received on the display screen A3. Fig. 7 (A) and 7 (B) show a display screen A4 and a display screen A5 as live-play screens of the medical application. On the display screen A4 and the display screen A5, the subject identification information B3, the video G indicated by the real-time video data, a recording control button D5, a right-eye selection button D6, a left-eye selection button D7, and a return button D8 are displayed.
The recording control button D5 is an example of the first button, and is a switching button for controlling the start/stop of recording of the real-time video data during live play. When the recording control button D5 is touched on the display screen A4, recording to the terminal-side recording means starts and the display shifts to the display screen A5. Conversely, when the recording control button D5 is touched on the display screen A5, recording to the terminal-side recording means stops and the display shifts back to the display screen A4. The recording control button D5 changes its display form in response to the start/stop control of the recording.
The right-eye selection button D6 and the left-eye selection button D7 are examples of the third button, and are buttons for selecting which of the subject's left and right eyes is the examination target. The right eye is selected when the right-eye selection button D6 labeled "R" is touched, and the left eye is selected when the left-eye selection button D7 labeled "L" is touched. When either button is selected, the two buttons are displayed in different forms so that the user can confirm which eye is selected.
In the example of fig. 7 (A) and 7 (B), the right-eye selection button D6 has been operated and the right eye is selected. When the button corresponding to the currently selected eye is touched again, the selected state is released and neither eye is selected. When the button corresponding to one eye is touched while the other eye is selected, the selection of the other eye is released and the selection changes to the touched eye. While the right-eye selection button D6 or the left-eye selection button D7 is selected, the video G recorded between the start and stop of recording of the real-time video data is recorded in the terminal-side recording means in association with the left/right selection information of the examined eye.
The back button D8 is a button for ending live play of the video G indicated by the real-time video data and shifting to the display screen A1 as the start screen.
Fig. 8 is a diagram showing an example of a display screen of the terminal device 120 when receiving a selection operation of any one of the management subject identification information B1 on the display screen A1. Fig. 8 shows a display screen A6 as a recorded data list screen of the medical application. On the display screen A6, the subject identification information B3, the video thumbnail J, the still image thumbnails H1 to H3, and the left-right selection information K1 to K4 are displayed.
The video thumbnail J is an image showing the content of the video represented by the recorded video data. In this example, a video thumbnail J is displayed for each piece of recorded video data, and each video thumbnail J is a still image taken at an arbitrary instant of its corresponding video. The still image thumbnails H1 to H3 are still images corresponding to the still image data (recorded still image data) recorded in the terminal-side recording means.
Here, the recorded still image data refers to data of a still image cut out from a video represented by the recorded video data and recorded in the terminal-side recording unit. An example of a display screen for cutting out a still image from a video will be described later. When a touch operation is performed on the video thumbnail J and the still image thumbnails H1 to H3, video playback or still image display is performed.
The left/right selection information K1 to K4 is the left/right selection information of the examined eye corresponding to each piece of recorded video data or recorded still image data. The left/right selection information K1 to K4 indicates, for the video thumbnail J and the still image thumbnails H1 to H3 respectively, which of the left and right eyes the corresponding video data or still image data belongs to. The "R" or "L" indicated by the left/right selection information K1 to K4 is determined by the touch operations performed on the right-eye selection button D6 and the left-eye selection button D7 shown in fig. 7.
On the display screen A6, at least one of the recorded video data and the recorded still image data may be transmitted to the server device 130 in accordance with a predetermined operation. For example, when a touch operation is performed on the video thumbnail J or the still image thumbnails H1 to H3, a button for choosing between playing the video (or displaying the still image) and transferring the data is superimposed and displayed, and the selected operation is executed. A plurality of pieces of recorded video data and/or recorded still image data may also be transmitted to the server device 130 together.
Fig. 9 is a diagram showing an example of a display screen of the terminal device 120 when receiving a selection operation of the video thumbnail J on the display screen A6. Fig. 9 shows a display screen A7 as a video playback screen of the medical application. On the display screen A7, a video L indicated by the recorded video data, the subject identification information B3, the cut button D9, and the return button D10 are displayed. The video L indicated by the recorded video data corresponds to the video thumbnail J subjected to the touch operation on the display screen A6. The back button D10 is a button for ending the playback of the video L and shifting to the display screen A6, which is a recorded data list screen.
The cut button D9 is an example of a second button, and is a button for making an extraction request for extracting a still image from video. When the cut button D9 is touched, still image data corresponding to the still image displayed on the display screen A7 among the video L is recorded in the terminal-side recording unit.
On the display screen A7, the frame to be displayed among the plurality of frames in the video L can be adjusted by a predetermined operation. For example, a touch operation on the left or right side of the display area of the video L (or on an icon displayed there) steps the playback one frame backward or forward, and a touch operation on the central portion of the display area (or on an icon displayed there) pauses or resumes the playback of the video L.
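The frame-by-frame stepping described above can be sketched as a clamped index adjustment. The function and parameter names are assumptions for illustration.

```python
def step_frame(index, total_frames, direction):
    """Sketch of the frame adjustment on the display screen A7: a tap on
    the left or right of the video L display area steps one frame backward
    or forward, clamped to the video's frame range."""
    if direction == "back":
        return max(0, index - 1)        # cannot step before the first frame
    if direction == "forward":
        return min(total_frames - 1, index + 1)  # cannot step past the last
    return index
```

Clamping at both ends matches the natural behavior of a playback screen: stepping backward at the first frame or forward at the last frame leaves the displayed frame unchanged.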
On the display screen A7, the play range of the video L can also be changed by a predetermined operation. For example, when an enlargement or reduction operation is performed on the display area of the video L, the video L is displayed enlarged or reduced; reduced display here means shrinking a video L that has been displayed enlarged. While the video L is displayed enlarged, sliding the display area changes which portion of the image is shown enlarged.
When the cut button D9 is touched while the video L is displayed enlarged, still image data corresponding to the still image of the enlarged portion of the video L is recorded in the terminal-side recording means. That is, regardless of whether the video L is displayed enlarged, touching the cut button D9 records still image data corresponding to the currently displayed still image in the terminal-side recording means.
As described above, the medical program of the present invention, which implements various processes in the terminal device 120 used with the medical examination device 110 having an imaging camera, receives various data including video data transmitted in real time while being captured by the medical examination device 110 (real-time video data), displays the video represented by the received real-time video data on the terminal display unit of the terminal device 120 by live play, controls the start/stop of recording of the real-time video data during live play in accordance with a predetermined operation performed on the terminal device 120, and records the real-time video data from the start of recording until its stop in the terminal-side recording means as recorded video data. Playback and recording of the video data generated by the photographing of the medical examination device 110 can therefore be performed while maintaining the sense of use of the medical examination device 110.
That is, live video can be played and recorded by operating the display screen of the application program executed by the terminal device 120, without changing the configuration of the device or the sense of use of the medical examination device 110.
Second embodiment
Conventionally, still image data has been extracted from video data and recorded. Still image data is recorded because a still image of a specific frame in the video data is desired for a specific purpose. For example, there are devices that temporarily store a video generated by photographing the eye of a patient with a medical photographing device, display the video, and record still image data of a frame selected by the user for diagnosis or the like in a recording unit provided in the device.
In such a device, the user must select the desired still image data in order to record it in the recording unit, which imposes a large workload on the user. On the other hand, if the video data is stored unchanged in a storage device such as an HDD (Hard Disk Drive) or SSD (Solid State Drive), the extraction of still image data can be carried out later, but because the capacity of video data is extremely large compared with that of still image data, the remaining recordable capacity of the recording unit provided in the device becomes small, and the user is burdened with managing the video data in consideration of the remaining capacity.
Hereinafter, an example of the image processing apparatus 220 according to the embodiment of the present invention will be described with reference to the drawings.
Fig. 10 is a block diagram showing an example of a functional configuration of the image processing apparatus 220 according to at least one embodiment of the present invention.
The image processing apparatus 220 is an example of the computer of the present invention. The image processing apparatus 220 acquires video data to be processed and calculates the degree of focusing of some or all of the frames constituting the acquired video data. The image processing apparatus 220 then extracts, as extracted still image data, some or all of the frames whose calculated degree of focusing is equal to or greater than a predetermined value (candidate frames), and records the extracted still image data in a recording unit provided in the apparatus. The degree of focusing here means the degree to which the subject appearing in a frame is in focus. A feature of the present invention is that still image data is extracted automatically, on the premise that image data with a high degree of focusing is highly likely to be an image the user desires.
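The candidate-frame selection described above reduces to a threshold filter over per-frame focus scores. The function and parameter names below are assumptions for illustration.

```python
def extract_candidates(frames_with_scores, threshold):
    """Sketch of candidate-frame extraction: keep the frames whose focus
    score is equal to or greater than the predetermined value.

    frames_with_scores: iterable of (frame, focus_score) pairs.
    threshold: the predetermined value for the degree of focusing.
    """
    return [frame for frame, score in frames_with_scores if score >= threshold]
```

The selection is deliberately inclusive at the threshold ("equal to or higher than a predetermined value"); which of the candidates are ultimately recorded may be further narrowed, since the patent allows extracting "a part or all" of them.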
As shown in fig. 10, the image processing apparatus 220 includes a video data acquisition unit 221, a focus degree calculation unit 222, a still image extraction unit 223, a display control unit 224, a reception unit 225, a recording unit 226, and a deletion unit 227.
The video data acquisition unit 221 has a function of acquiring video data of a processing target. The video data of the processing target is not particularly limited, but captured video data is preferable. In this example, the video data is video data transmitted from an external device, and the video data acquisition unit 221 acquires the video data by receiving the video data. As described later, the video data captured by the ophthalmologic imaging apparatus is assumed to be one of the processing targets, but the image processing apparatus 220 of this example is applicable not only to the ophthalmologic field but also to video data in a wide range of fields.
The video data acquisition unit 221 stores the acquired video data in a storage unit included in the computer. The storage unit here holds the video data temporarily and is realized, for example, by memory. The storage unit may also be realized by handling the video data as a temporary file. The video data stored in the storage unit is deleted, for example, when an application program associated with the extraction of still image data ends. When the video data is handled as a temporary file, it may be held in a storage device such as an HDD or SSD.
The focusing degree calculating unit 222 has a function of calculating the focusing degree of a frame constituting a part or all of the video data.
Here, the degree of focusing is calculated for each target frame. Which of the frames constituting the video data are chosen as targets of the calculation is not particularly limited. The degree of focusing is preferably calculated from the amount of edges extracted from the frame. The calculated degree of focusing is used as a focus score by the still image extraction unit 223 described later.
The calculation of the degree of focus in the degree of focus calculation unit 222 may be realized by an edge extraction image generation process of generating an edge extraction image by performing an edge extraction process on frames constituting video data, and a calculation process of calculating the degree of focus of frames corresponding to the edge extraction image from the distribution of edges in the edge extraction image.
Here, the edge extraction processing means processing that extracts, as edges, portions of the image where the luminance value changes sharply (that is, where it does not change continuously). The edge extraction processing may be performed using a Laplacian filter, for example a 3×3 8-direction Laplacian filter. Any of various known techniques may be used as long as edges can be extracted. The calculation processing is not particularly limited as long as it is based on the amount of edges in the edge extraction image; as one example, the distribution of edges in the edge extraction image may be taken as the degree of focusing of the frame corresponding to that image.
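As an illustrative sketch (not the patent's actual implementation), the 3×3 8-direction Laplacian filter and an edge-based focus score might look as follows; the way the edge responses are aggregated into a single score (here, the mean absolute response) is an assumption, since the description leaves this open.

```python
# Hypothetical sketch: 3x3 8-direction Laplacian edge extraction and a
# focus score derived from the amount of edges. Images are plain 2D lists
# of grayscale intensities.

LAPLACIAN_8 = [[1,  1, 1],
               [1, -8, 1],
               [1,  1, 1]]

def laplacian_response(gray, x, y):
    """Convolve the 8-direction Laplacian kernel at interior pixel (x, y)."""
    acc = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += LAPLACIAN_8[dy + 1][dx + 1] * gray[y + dy][x + dx]
    return acc

def focus_degree(gray):
    """Focus score: mean absolute edge response over all interior pixels
    (an assumed aggregation; the patent only says 'amount of edges')."""
    h, w = len(gray), len(gray[0])
    total, count = 0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            total += abs(laplacian_response(gray, x, y))
            count += 1
    return total / count if count else 0.0

# A flat region produces no edge response; a sharp step edge scores higher.
flat  = [[100] * 5 for _ in range(5)]
edged = [[0, 0, 0, 200, 200] for _ in range(5)]
```

In this sketch an out-of-focus (flat) frame scores 0 while a frame containing a sharp luminance step scores positively, which is the ordering the extraction relies on.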
The focusing degree calculation unit 222 may perform a green component image generation process of generating green component image data by extracting a green component in color information of each pixel constituting a frame and a gradation process of generating gradation image data by gradation of the green component image data, and calculate the degree of focusing from the gradation image data obtained by the gradation process.
The green component here means the green component among the three primary colors used in RGB (Red Green Blue), one of the representation formats of an image. The green component image is an image obtained by removing the red and blue components of the three RGB primaries from the still image of the frame. Graying (grayscale conversion) means representing the colors used in the image with white, black, and at least one intermediate level of gray. In general, only luminance information is extracted from a color image, and the representation changes stepwise from black to white over the gradation range of the luminance.
For example, when the video data is a video of a subject captured under blue background illumination, generating a green component image in which the green component is extracted and then graying it produces an image with higher contrast than graying without the extraction, so frames with a large degree of focusing can be extracted more accurately. When the background illumination is not blue, generating a green component image does not greatly affect the result of the focus-degree calculation. Therefore, as long as a configuration that generates a green component image is adopted, the degree of focusing can be calculated with sufficient accuracy regardless of the color of the background illumination.
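The green-component extraction and graying steps can be sketched as follows. This is a minimal illustration under the assumption that, since the red and blue channels are discarded, the remaining green value serves directly as the grayscale intensity; pixels are plain (R, G, B) tuples here.

```python
# Hypothetical sketch of the green-component and graying steps described
# above. A frame is a 2D list of (R, G, B) tuples.

def green_component(rgb_frame):
    """Remove the red and blue components: (R, G, B) -> (0, G, 0)."""
    return [[(0, g, 0) for (_r, g, _b) in row] for row in rgb_frame]

def to_grayscale(green_frame):
    """Gray the green-component image; the green value becomes the
    grayscale level, since the other channels are already zero."""
    return [[g for (_r, g, _b) in row] for row in green_frame]

frame = [[(255, 10, 0), (0, 200, 255)]]
gray = to_grayscale(green_component(frame))
```

Under blue background illumination with fluorescein staining, this has the effect of keeping only the stained structures of interest in the grayscale image handed to the edge extraction.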
In particular, in examinations in the ophthalmic field, blue background illumination is used in combination with fluorescein staining. With this combination, an image of the eye to be examined contains more blue and green components than red. If the edge extraction processing is applied as-is to frames of video captured in this way, edges cannot be extracted cleanly, and as a result a sufficient degree of focusing may not be calculated. The portion one actually wants to observe under the combination of blue background illumination and fluorescein staining is the green component. Therefore, by generating a green component image and graying it in the focus-degree calculation, the degree of focusing of frames that are in focus on the portion to be observed is calculated to be large. Moreover, since a barrier filter is applied under this combination to remove the blue component of the captured video, using a green component image in which the green component is extracted poses no problem for the calculation of the degree of focusing. Even when a green component image is generated for video of the eye captured under background illumination of other colors, the result of the focus-degree calculation is not greatly affected, so a configuration that uniformly generates a green component image can be adopted.
The focusing degree calculation unit 222 may perform adjustment processing for adjusting the black level of the gradation image data, and calculate the focusing degree from the gradation image data of which the black level has been adjusted by the adjustment processing.
Here, the degree of adjustment of the black level is appropriately determined according to the type of the subject of the video. By adopting the configuration of adjusting the black level, noise of the gradation image can be reduced, and the influence of noise in the focusing degree calculation can be reduced, with the result that the focusing degree can be calculated more accurately.
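The description does not specify how the black level is adjusted, so the following is only a plausible sketch in which intensities at or below an assumed black-level threshold are clamped to zero, suppressing low-level background noise before edge extraction.

```python
# Hypothetical black-level adjustment: the exact method is not specified
# in the description; simple clamping below a threshold is assumed here.

def adjust_black_level(gray, black_level):
    """Raise the black point of a grayscale image: intensities at or
    below `black_level` (assumed noise) become 0, others are kept."""
    return [[0 if v <= black_level else v for v in row] for row in gray]

noisy = [[5, 30, 200], [12, 0, 90]]
cleaned = adjust_black_level(noisy, 20)
```

Because the Laplacian responds to any luminance variation, zeroing near-black noise this way keeps spurious low-amplitude edges from inflating the focus score of dark, defocused frames.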
The focusing degree calculation unit 222 may set, as target frames for the focus-degree calculation, one frame per predetermined time interval or per predetermined number of frames among the frames constituting the video data. The predetermined time interval and the predetermined number of frames are determined appropriately, for example according to the processing performance of the image processing apparatus 220 and the specifications of the software. Such a configuration shortens the processing time of the image processing apparatus 220.
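Selecting one target frame per predetermined time interval can be sketched as below; the frame rate and interval values are illustrative assumptions, not values from the description.

```python
# Hypothetical sampling of target frames: one frame per fixed time
# interval, expressed as frame indices into the video.

def target_frame_indices(total_frames, fps, interval_sec):
    """Indices of the frames whose degree of focusing is calculated:
    one frame every `interval_sec` seconds of video."""
    step = max(1, round(fps * interval_sec))
    return list(range(0, total_frames, step))

# e.g. a 100-frame clip at 30 fps, sampled every half second.
indices = target_frame_indices(100, fps=30, interval_sec=0.5)
```

Scoring only these sampled frames instead of every frame trades a small chance of missing the single sharpest frame for a large reduction in processing time, which matches the stated motivation.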
In the case where the video data is data of a video obtained by capturing an eye of the subject, the focusing degree calculation unit 222 may calculate the focusing degree with respect to a region from which a predetermined range of the upper side and/or the lower side of the frame constituting the video data is removed.
For example, in the focus-degree calculation for video of a subject's eye, regions in a predetermined range at the top and/or bottom of a frame are not used in diagnosis, yet they still affect the calculation result; the upper region of the frame, for instance, contains the subject's eyelashes. By removing the regions containing such diagnostically unused portions from the target of the focus-degree calculation, the degree of focusing of frames that are in focus on the diagnostically usable region is calculated to be large. Limiting the region targeted by the calculation also reduces the processing time of the image processing apparatus 220. The predetermined range at the top and/or bottom of the frame is determined appropriately; for example, it may be the top quarter and the bottom quarter of the frame.
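Removing the top and bottom bands before scoring can be sketched as follows; the quarter fractions mirror the example in the text, but treating them as parameters is an assumption.

```python
# Hypothetical cropping step: drop the top and bottom bands of the frame
# (e.g. the eyelash region) before the focus-degree calculation.

def crop_for_focus(gray, top_fraction=0.25, bottom_fraction=0.25):
    """Return only the rows inside the diagnostically usable band."""
    h = len(gray)
    top = int(h * top_fraction)
    bottom = h - int(h * bottom_fraction)
    return gray[top:bottom]

rows = [[i] for i in range(8)]   # an 8-row dummy frame
band = crop_for_focus(rows)      # keeps the middle 4 rows
```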
The still image extraction unit 223 has a function of extracting, as extracted still image data, a part or all of frames (candidate frames) having a degree of focus equal to or higher than a predetermined value among the frames having a degree of focus calculated. Here, the predetermined value is a threshold value of the predetermined focusing power. The threshold is set appropriately by, for example, combining the content of the object in the video data and the upper limit number related to the extraction.
In addition, when a plurality of candidate frames exist within a predetermined time range when the candidate frames are arranged in time series, the still image extraction unit 223 may exclude a predetermined number of candidate frames from the extraction target from the candidate frames having a small focusing degree.
The predetermined time range is determined appropriately. Excluding a predetermined number of candidate frames from the extraction target means that those candidate frames are not handled as extracted still image data. By performing this exclusion within a predetermined time range, the number of candidate frames extracted within any specific span of time can be suppressed, reducing the total capacity of extracted still image data recorded in the recording unit. As an example, if five candidate frames exist within the predetermined time range and two are to be excluded starting from those with the smallest degree of focusing, the two least-focused frames are excluded from the extraction target and three candidate frames remain for that time range.
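The worked example above (five candidates in one window, two excluded, three remaining) can be sketched like this; representing candidates as (timestamp, focus score) pairs is an assumption for illustration.

```python
# Hypothetical thinning of candidate frames inside one predetermined
# time range: drop the entries with the smallest degree of focusing.

def thin_window(candidates, exclude_count):
    """`candidates`: (timestamp, focus) pairs that all fall within one
    time window, in time order. Excludes the `exclude_count` entries
    with the lowest focus scores and returns the rest in time order."""
    survivors = sorted(candidates, key=lambda c: c[1])[exclude_count:]
    return sorted(survivors)  # restore chronological order

window = [(0, 9), (1, 2), (2, 7), (3, 4), (4, 8)]  # five candidates
kept = thin_window(window, exclude_count=2)        # three remain
```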
In addition, when extracting still image data from the candidate frames, the still image extraction unit 223 may focus on the candidate frames one by one in time order and, when a candidate frame with a smaller degree of focusing than the frame of interest exists within a predetermined time range of the frame of interest, exclude that candidate frame from the extraction target.
Here, sequentially focusing on candidate frames in time series means that the candidate frames are determined as focused frames in time series. The time sequence may be from front to back in time series or from back to front in time series. The configuration for determining the predetermined time range from the time point of the frame of interest is not particularly limited, and may be a configuration for determining a time point before the predetermined time range from the time point of the frame of interest, or may be a configuration for determining a time point after the predetermined time range from the time point of the frame of interest.
For example, when the video data to be processed is a video that continuously captures a plurality of subjects (for example, the left and right eyes of an examinee), the average degree of focusing of the frames may differ from subject to subject. That is, within the same video data, regions where highly focused frames cluster and regions where somewhat less focused frames cluster may correspond to different imaging subjects, so that the distribution of the degree of focusing varies. In such a situation, if candidate frames are simply selected in descending order of focus until a predetermined upper limit is reached, some subjects may not yield a sufficient number of frames, and extracted still image data may not be obtained adequately for every subject. In contrast, when a plurality of candidate frames exist within a predetermined time range of the frame of interest, excluding some of them from the extraction target avoids a situation in which still images are extracted from only part of the imaging subjects; as a result, candidate frames corresponding to the other imaging subjects are more likely to remain as extraction targets. In other words, the likelihood of obtaining extracted still image data for every subject without omission increases.
In addition, the still image extraction unit 223 may exclude a frame having the smallest focusing degree among the candidate frames from the extraction target so that the total number of extracted still image data does not exceed a predetermined upper limit number when extracting still image data from the candidate frames. Here, the total number of still image data extracted is not particularly limited, and is preferably determined based on the recordable capacity of the recording unit of the image processing apparatus 220, and the like. The configuration for making the total count not exceed the predetermined upper limit number is not particularly limited, and the total count may be reduced in order to avoid the state in which the total count exceeds the upper limit number, or may be reduced in the case in which the total count reaches a predetermined threshold value smaller than the upper limit number. By adopting such a configuration, the user can reduce the burden of managing the recording capacity of extracting still image data.
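Keeping the total at or below the upper limit by evicting the least-focused candidate can be sketched with a min-heap keyed on the focus score; the heap representation is an implementation choice assumed here, not something the description prescribes.

```python
# Hypothetical enforcement of the upper limit on extracted still images:
# a min-heap keyed on focus score evicts the weakest candidate when full.
import heapq

def add_with_limit(heap, focus, frame_id, limit):
    """Add a candidate (focus, frame_id); if the total would exceed
    `limit`, the frame with the smallest degree of focusing is dropped."""
    heapq.heappush(heap, (focus, frame_id))
    if len(heap) > limit:
        heapq.heappop(heap)  # evict the least-focused candidate

heap = []
for fid, focus in enumerate([5, 9, 3, 7, 8]):
    add_with_limit(heap, focus, fid, limit=3)
```

After the loop only the three best-focused frames remain, so the recording capacity consumed by extracted still images stays bounded regardless of video length.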
The display control unit 224 has a function of causing the display unit of the image processing apparatus 220 to display the extracted still image represented by the extracted still image data. Examples of display sections are touch screen displays.
In addition, the display control unit 224 may cause the display unit to display a video represented by video data including the extracted still image when a predetermined first input operation is performed. Examples of the first input operation include an operation of selecting an extracted still image represented by the extracted still image data displayed on the display unit, a thumbnail image of the extracted still image, and a predetermined video play button.
In addition, the display control unit 224 may continue to play the video data after the extracted still image data when a predetermined second input operation is performed on any one of the extracted still images indicated by the extracted still image data. Examples of the second input operation are an operation of selecting an extracted still image represented by the extracted still image data or a thumbnail image of the extracted still image, or an operation of selecting a predetermined video play button when the extracted still image is displayed on the display section. With such a configuration, the user can check whether or not there is a frame to be recorded before or after the point in time of extracting the frame corresponding to the still image data.
The reception unit 225 has a function of receiving a selection operation from the user for extracting a still image displayed on the display unit. The receiving unit 225 may receive a frame selection operation for a displayed video. Examples of the selection operation of the frame performed for extracting the still image or the displayed video include an operation of selecting a button corresponding to the frame with respect to extracting the still image or the displayed video.
The recording unit 226 has a function of causing the extracted still image data to be recorded in a recording unit provided in the image processing apparatus 220. Here, the recording unit is a unit for holding data for a long time, unlike the above-described storage unit. The recording unit is realized by a storage device such as an HDD or SSD, for example.
In addition, when a selection operation is received for the extracted still image displayed on the display unit, the recording unit 226 may record, in the recording unit, extracted still image data indicating the extracted still image for which the selection operation was received. By recording the extracted still image data representing the extracted still image selected by the user in this way, it is possible to suppress an increase in the total capacity of the recorded extracted still image data.
In addition, when a selection operation of a frame is received for a video displayed on the display unit, the recording unit 226 may record still image data indicating the frame for which the selection operation was received in the recording unit. Even in the case where the image to be recorded is not present in the extracted still image, the still image data of the frame desired by the user can be recorded in the recording unit by playing the video in accordance with the operation from the user.
The deletion unit 227 has a function of deleting video data stored in the storage unit at a predetermined timing (first timing) after the still image extraction unit 223 has extracted still image data.
The first timing is not particularly limited as long as it is after extraction of the still image data, and may be a timing at which the still image extraction unit 223 extracts the still image data, or a timing at which the recording unit 226 records the extracted still image data. The first timing may be a timing when a predetermined time has elapsed after the still image data is extracted, or may be a timing when an application program related to the extraction of the still image data ends.
An example of the functional configuration of the image processing apparatus 220 according to at least one embodiment of the present invention has been described above. In the example described with reference to fig. 10, the video data acquired by the video data acquisition unit 221 is deleted from the storage unit by the deletion unit 227 at the first timing, after the still image data is extracted, but the timing of deleting the video data is not limited to this. Even when the still image extraction unit 223 has not extracted still image data, the deletion unit 227 may delete the video data stored in the storage unit at a predetermined timing (second timing) after the video data was stored there. The condition for the second timing is not particularly limited as long as it follows the storage of the video data; for example, considering that the user may want to retain video data held in the temporary storage unit, completion of copying the video data to a recording unit capable of long-term recording can serve as the trigger for the second timing. Once it is confirmed that the video data has been copied to a storage device capable of long-term recording, such as an HDD or SSD, there is no longer any need to retain it in the temporary storage unit, so it is deleted from there.
Fig. 11 is an explanatory diagram for explaining an example of an environment to which the image processing apparatus 220 according to at least one embodiment of the present invention is applied. Fig. 11 shows a medical examination apparatus 210, an image processing apparatus 220, and a server apparatus 230.
The medical examination apparatus 210 is an apparatus for performing examination on an examination subject. The examination object is, for example, a human or an animal. The medical examination device 210 has at least a camera for photographing and a function of photographing an examination object to generate video data. The medical examination device 210 has a communication function, and in this example, performs wireless communication with the image processing device 220. The medical examination device 210 transmits video data generated by imaging to the image processing device 220. In the case where the medical examination apparatus 210 and the image processing apparatus 220 perform wired communication, the medical examination apparatus 210 and the image processing apparatus 220 are connected to each other by, for example, a USB cable or the like. In the example shown in fig. 11, the device that performs video capturing is the medical examination device 210, but the device is not limited to this as long as it can perform video capturing.
In this example, the medical examination apparatus 210 is a hand-held apparatus called a slit lamp, which irradiates an eye to be examined with slit light, and inspects a cornea, a crystalline lens, or the like of the eye to be examined by observing scattered light generated by scattering in the eye to be examined. The medical examination device 210 is not limited to the above example, and may be a handheld medical examination device such as a handheld fundus camera or a stationary medical device such as a dry eye examination device, in addition to a handheld slit lamp.
The image processing apparatus (terminal apparatus) 220 in fig. 11 is an apparatus used together with the medical examination apparatus 210. In fig. 11, the terminal device functions as an image processing device 220.
The image processing apparatus 220 may be designed as a dedicated device, but it can also be implemented by a general-purpose computer. That is, the image processing apparatus 220 includes at least the CPU and memory typically provided in a general computer. Its processing is realized by reading a program for executing the processing from memory and executing it on a CPU or GPU functioning as a control circuit; in other words, the configuration is such that the processor can execute the processes of the respective units through execution of the program. The image processing apparatus 220 may be any apparatus capable of processing equivalent to a computer, and may be implemented, for example, by a smartphone or a tablet terminal. In this example, the image processing device 220 is a tablet terminal.
The image processing apparatus 220 has a function of communicating with the medical examination apparatus 210 and the server apparatus 230. Specifically, the image processing apparatus 220 performs wireless communication with the medical examination apparatus 210, performs wired or wireless communication with the server apparatus 230, and performs transmission or reception of various information.
The image processing apparatus 220 further includes a display unit, and various information is displayed on the display unit. Specifically, the image processing apparatus 220 displays a video represented by the extracted still image or video data on the display unit. In this example, the image processing apparatus 220 as a tablet terminal has a touch panel functioning as a display unit, and various information including a still image extracted and a video represented by video data is displayed on the touch panel, and various input operations from a user are received by a touch operation performed on the touch panel.
The server device 230 is a device for managing various information such as video data transmitted from the image processing device 220. The server device 230 has a server-side recording unit that records video data or still image data received from the image processing device 220. The data recorded in the server-side recording unit may be classified and managed based on the identification information. In this example, the server apparatus 230 causes the extracted still image data to be recorded in the server-side recording unit provided in the server apparatus 230 in correspondence with the electronic medical record corresponding to the subject, based on the identification information corresponding to the extracted still image. In addition, the server-side recording unit that records the extracted still image data transmitted from the image processing apparatus 220 is not particularly limited. Other examples of the server-side recording unit that records the still image data transmitted from the image processing apparatus 220 are recording units belonging to the same network as the server apparatus 230.
The communication between the server device 230 and the image processing device 220 may be performed by wired connection such as a USB cable, or may be performed by wireless using a wireless LAN or the like. The server device 230 is a device for managing electronic medical records and the like provided in an in-hospital LAN of a hospital, for example.
Fig. 12 is a flowchart showing an example of a flow of frame extraction processing performed by the image processing apparatus 220 according to at least one embodiment of the present invention.
In fig. 12, the extraction process of the image processing apparatus 220 starts by extracting a frame of a processing object from video data (step S201). In this example, the extraction processing by the image processing apparatus 220 is started by extracting one frame as a frame to be processed from a plurality of frames constituting video data at predetermined time intervals.
Next, the image processing apparatus 220 selects any one of the frames to be processed extracted in step S201 (step S202). In this example, the image processing apparatus 220 selects one frame at the earliest point in time at which the frames to be processed are not selected when they are arranged in chronological order.
Next, the image processing apparatus 220 calculates the degree of focusing of the frame selected in step S202 (step S203). In this example, the image processing apparatus 220 generates an edge extraction image by performing edge extraction processing on the frame selected in step S202 using a laplacian filter, and calculates the degree of focus of the frame corresponding to the edge extraction image from the distribution of edges in the edge extraction image.
Next, when the degree of focusing calculated in step S203 is not equal to or higher than the predetermined value (no in step S204), the image processing apparatus 220 proceeds to step S211.
On the other hand, when the degree of focusing calculated in step S203 is equal to or greater than the predetermined value (yes in step S204), the image processing apparatus 220 searches for the minimum degree of focusing among the candidate frames within the predetermined time range (step S205). In this example, the image processing apparatus 220 searches the candidate frame list for the frame with the smallest degree of focusing that exists within the predetermined time range.
Next, when the degree of focusing of the frame selected in step S202 is smaller than the minimum value of the degree of focusing searched in step S205 (no in step S206), the image processing apparatus 220 proceeds to step S208.
On the other hand, when the degree of focusing of the frame selected in step S202 is larger than the minimum degree of focusing found in step S205 (yes in step S206), the image processing apparatus 220 excludes the candidate frame having that minimum degree of focusing from the extraction target (step S207). In this example, the image processing apparatus 220 deletes the candidate frame with the smallest degree of focusing found in step S205 from the candidate frame list.
Next, when the total number of candidate frames does not exceed the upper limit number (no in step S208), the image processing apparatus 220 proceeds to step S210.
On the other hand, when the total number of candidate frames exceeds the upper limit number (yes in step S208), the image processing apparatus 220 excludes the frame having the smallest focusing degree among the candidate frames from the extraction target (step S209). In this example, the image processing apparatus 220 deletes the frame with the smallest focusing degree among the candidate frames from the candidate frame list.
Next, the image processing apparatus 220 adds the frame under selection as a candidate frame (step S210). In this example, the image processing apparatus 220 adds information for determining the frame under selection to the candidate frame list.
Next, if the selected frame is not the last selected frame among the frames to be processed (no in step S211), the image processing apparatus 220 returns to step S202.
On the other hand, when the selected frame is the last selected frame among the frames to be processed (yes in step S211), the image processing apparatus 220 ends the frame extraction process.
An example of the flow of the frame extraction processing performed by the image processing apparatus 220 has been described above. In the example shown in fig. 12, the image processing apparatus 220 performs the series of processes for one frame, from calculating its degree of focusing through deciding on the exclusion and addition of candidate frames, before moving to the next frame. Alternatively, for example, the image processing apparatus 220 may first perform the focus-degree calculation for each of a plurality of frames and then perform the exclusion and addition of candidate frames.
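The loop of steps S201 to S211 can be condensed into the following sketch. The data representation ((timestamp, focus) pairs), the window test, and the parameter values are assumptions made for illustration; the sketch follows the branch structure of fig. 12, including the detail that a frame passing the threshold is added even when it does not beat the window minimum.

```python
# Hypothetical condensation of the Fig. 12 frame extraction flow.

def extract_candidates(scores, threshold, window, limit):
    """`scores`: (timestamp, focus) pairs for the sampled frames, in
    time order. Returns the surviving candidate frames in time order."""
    candidates = []  # the "candidate frame list"
    for t, f in scores:
        if f < threshold:                                # S204: not focused enough
            continue
        near = [c for c in candidates if abs(c[0] - t) <= window]
        if near:                                         # S205: window minimum
            weakest = min(near, key=lambda c: c[1])
            if f > weakest[1]:                           # S206/S207: displace it
                candidates.remove(weakest)
        if len(candidates) >= limit:                     # S208/S209: enforce cap
            candidates.remove(min(candidates, key=lambda c: c[1]))
        candidates.append((t, f))                        # S210: add this frame
    return candidates

result = extract_candidates(
    [(0, 5), (1, 9), (2, 3), (10, 7), (11, 8)],
    threshold=4, window=2, limit=3)
```

In the example call, the frame at t=2 fails the threshold, and within each cluster the stronger frame displaces the weaker one, leaving one candidate per cluster.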
Fig. 13 is a flowchart showing an example of a flow of a frame focusing degree calculation process performed by the image processing apparatus 220 according to at least one embodiment of the present invention. In fig. 13, an example of a detailed flow of the focusing degree calculation processing of step S203 in fig. 12 will be described.
In fig. 13, the focusing degree calculation processing by the image processing apparatus 220 starts by acquiring a frame of a processing object from video data (step S251). In this example, the focusing degree calculation processing by the image processing apparatus 220 starts by acquiring the frame selected in step S202 of fig. 12 among the frames constituting the video data.
Next, the image processing apparatus 220 generates green component image data by extracting the green component from the color information of each pixel constituting the selected frame (step S252). In this example, the image processing apparatus 220 generates green component image data consisting only of the green component, with the red component and the blue component of the frame removed.
Next, the image processing apparatus 220 generates grayscale image data by converting the green component image data to grayscale (step S253).
Next, the image processing apparatus 220 adjusts the black level of the grayscale image data (step S254). In this example, the image processing apparatus 220 adjusts the black level of the grayscale image data to eliminate noise on the image.
Next, the image processing apparatus 220 performs edge extraction processing on the grayscale image data whose black level has been adjusted, to generate an edge extraction image (step S255). In this example, the image processing apparatus 220 performs the edge extraction processing using a Laplacian filter.
Next, the image processing apparatus 220 calculates the degree of focus of the edge extraction image (step S256), and ends the focusing degree calculation processing. In this example, the image processing apparatus 220 calculates the degree of focus of the frame corresponding to the edge extraction image from the distribution of the edges in the edge extraction image, and then ends the processing. The calculated degree of focus is handled as a focus score.
In the above, an example of the flow of the frame focusing degree calculation processing performed by the image processing apparatus 220 is described.
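As a concrete illustration, the pipeline of steps S252 to S256 can be sketched as follows. This is a minimal sketch using NumPy; the black-level threshold value and the use of variance as the focus score are assumptions made for illustration, not details taken from the apparatus itself.

```python
import numpy as np

def focus_score(frame_rgb, black_level=8):
    """Sketch of the focus-degree pipeline (steps S252-S256).

    frame_rgb: H x W x 3 uint8 array.
    black_level and the variance-based score are illustrative assumptions.
    """
    # S252: keep only the green component of each pixel
    green = frame_rgb[:, :, 1].astype(np.float64)
    # S253: the isolated green channel already serves as a grayscale image
    gray = green
    # S254: black-level adjustment to suppress low-amplitude sensor noise
    gray = np.clip(gray - black_level, 0, None)
    # S255: edge extraction with a 3x3 Laplacian kernel
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=np.float64)
    h, w = gray.shape
    padded = np.pad(gray, 1, mode="edge")
    edges = np.zeros_like(gray)
    for dy in range(3):
        for dx in range(3):
            edges += k[dy, dx] * padded[dy:dy + h, dx:dx + w]
    # S256: summarize the edge distribution as a single focus score
    return float(np.var(edges))
```

A sharp frame produces strong, widely distributed edge responses and hence a high score, while a defocused or uniform frame produces a score near zero.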
Fig. 14 is an explanatory diagram for explaining an example of management of video data and still image data by the image processing apparatus 220 according to at least one embodiment of the present invention.
In the example shown in fig. 14, the image processing apparatus 220 manages video data in a temporary folder and manages extracted still image data in a still image folder. Fig. 14 shows the flow of photographing the eyes of a subject during an examination, selecting a focused frame, and storing that frame. Here, the sequence of operations from photographing the eyes of the subject to recording the focused frame in the recording unit of the image processing apparatus 220 is referred to as an "examination".
First, the image capturing screen shown in fig. 14 shows the state being captured by the medical examination device 210 during the examination of the subject; it is received in real time by the image processing device 220 and displayed on the display unit. The image processing apparatus 220 saves the received video data as a temporary file in the temporary folder. The image processing apparatus 220 then calculates the degree of focus for the frames constituting the video data stored in the temporary folder, and treats frames whose degree of focus is equal to or higher than a predetermined value as extracted still image data.
The extracted still image list screen shown in fig. 14 displays a list of the extracted still images represented by the extracted still image data after the eyes of the subject have been photographed. Here, the user performs a touch selection operation on the extracted still images to be saved in the extracted still image list screen. The image processing apparatus 220 records the extracted still image data representing the extracted still images selected by the user in the still image folder. The image processing apparatus 220 then deletes the video data from the temporary folder, with the recording of the extracted still image data into the still image folder as a trigger.
The video playback screen shown in fig. 14 plays back video data in response to a predetermined operation performed on the extracted still image list screen. For example, when a video playback operation is performed on an extracted still image displayed on the extracted still image list screen, the image processing apparatus 220 plays the video from that extracted still image onward. Then, when a predetermined frame designating operation is performed on the video playback screen, still image data of the designated frame may be recorded in the still image folder.
By managing the video data and the still image data in this manner, it is possible to reduce the capacity of data stored in the image processing apparatus 220 while retaining the still image data desired by the user.
The above description has covered an example of management of video data and still image data by the image processing apparatus 220. In the example described with reference to fig. 14, the video data stored in the temporary folder is deleted with the recording of the extracted still image data as a trigger, but the method of managing the video data is not limited to this. For example, when receiving a recording instruction operation from the user for the video data stored in the temporary folder, the image processing apparatus 220 may be configured to record the video data in a video recording folder for long-term storage (not shown). With such a configuration, the user is given the option of retaining the video data, rather than having it deleted, even after the end of the examination, and can manage the video data and still image data more flexibly.
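The temporary-folder and still-image-folder workflow described above can be modeled as follows. This is a minimal sketch; the class name, folder names, and method signatures are assumptions made for illustration and do not appear in the original disclosure.

```python
import shutil
from pathlib import Path

class ExamStore:
    """Toy model of the video/still-image folder management around fig. 14."""

    def __init__(self, root):
        self.temp = Path(root) / "temp"       # received video lands here first
        self.stills = Path(root) / "stills"   # user-selected extracted stills
        self.temp.mkdir(parents=True, exist_ok=True)
        self.stills.mkdir(parents=True, exist_ok=True)

    def save_video(self, name, data):
        # real-time video data is saved as a temporary file
        (self.temp / name).write_bytes(data)

    def commit_still(self, video_name, still_name, data):
        # recording a selected still triggers deletion of the temporary video
        (self.stills / still_name).write_bytes(data)
        (self.temp / video_name).unlink(missing_ok=True)

    def keep_video(self, video_name, archive_name="videos"):
        # optional long-term retention instead of deletion
        dest = self.temp.parent / archive_name
        dest.mkdir(exist_ok=True)
        shutil.move(str(self.temp / video_name), str(dest / video_name))
```

Committing a still image removes the backing temporary video, which is what keeps the stored data volume small while retaining the images the user actually wants.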
Fig. 15 is an explanatory diagram illustrating an example of extraction processing performed by the image processing apparatus 220 according to at least one embodiment of the present invention to extract still image data from candidate frames. Fig. 15 shows a graph of a change in the degree of focus when candidate frames in video data are arranged in chronological order.
In the example shown in fig. 15, the left eye of the subject is photographed first, and then the right eye. In the graph shown in fig. 15, there is one region of high degree of focus on the left-eye imaging side and one on the right-eye imaging side, with a region of low degree of focus occurring between them. This is because the degree of focus temporarily decreases when the imaging target shifts from the left eye to the right eye. Comparing the time range in which the left eye is photographed with that in which the right eye is photographed, the frames of the right-eye portion are calculated to have a higher average degree of focus than the frames of the left-eye portion.
Here, if only the frames with a high degree of focus were extracted from the entire set of frames constituting the video data, the frames of the right-eye portion would become extraction targets while the frames of the left-eye portion would not. In order to make the frames of the left-eye portion extraction targets as well, the frames constituting the video data are examined one by one in chronological order, and any frame whose degree of focus is smaller than that of the examined frame and that lies within a predetermined time (for example, 1 second) before the examined frame is excluded from the extraction targets. In this way, the extraction targets are less likely to be biased toward the right-eye portion, and frames of the left-eye portion are more likely to remain as extraction targets. An increase in the total capacity of the recorded extracted still image data can thus be suppressed.
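One reading of this window-based exclusion rule can be sketched as follows: a candidate is dropped whenever a higher-scoring candidate follows it within the window, so only local peaks per eye survive. The one-directional window and the function shape are assumptions for illustration.

```python
def prune_candidates(candidates, window=1.0):
    """candidates: chronological list of (time_sec, focus_score) tuples.

    A frame is excluded when a candidate with a strictly higher focus
    score follows it within `window` seconds; the survivors are the
    local peaks, so both the left-eye and right-eye portions keep frames.
    """
    survivors = []
    for i, (ti, fi) in enumerate(candidates):
        beaten = any(0 < tj - ti <= window and fj > fi
                     for tj, fj in candidates[i + 1:])
        if not beaten:
            survivors.append((ti, fi))
    return survivors
```

For example, a left-eye frame scored 0.7 survives alongside a distant right-eye frame scored 0.9, while a 0.5-scored frame immediately before the 0.9 peak is dropped.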
In the above, an extraction processing example in which the image processing apparatus 220 extracts still image data from among candidate frames is described.
As described above, the image processing program of the present invention causes a computer (for example, the image processing apparatus 220) to realize: a video data acquisition unit 221 that acquires the video data to be processed; a focusing degree calculating unit 222 that calculates a degree of focus for some or all of the frames constituting the video data; a still image extraction unit 223 that extracts, as extracted still image data, some or all of the frames whose calculated degree of focus is equal to or higher than a predetermined value; and a recording unit 226 that records the extracted still image data in a recording unit provided in the computer. Still image data can therefore be automatically extracted from video data and recorded.
[ additionally remembered ]
The first embodiment described above is disclosed in such a manner that at least the following inventions can be implemented by those of ordinary skill in the art to which the invention pertains.
[1]
A medical program for realizing various processes in a terminal device used with a medical examination device having a photographing camera, wherein the medical program realizes:
A receiving function of receiving various data including video data (hereinafter, referred to as real-time video data) transmitted in real time while being photographed by the medical examination apparatus;
a display control function of displaying a video represented by the received real-time video data by live broadcasting on a display unit included in the terminal device;
a recording function of controlling start/stop of recording of the real-time video data in live play in accordance with a prescribed operation performed on the terminal device;
and a recording control function for recording the real-time video data, which is recorded by the recording function, in a recording unit provided in the terminal device until the recording is stopped, as recorded video data.
[2]
The medical program according to [1], wherein,
the following functions are implemented in the display control function: causing a first button for instructing start/stop of recording of the real-time video data in live play to be displayed on the display unit, which is a touch screen provided in the terminal device,
the following functions are implemented in the recording function: and controlling the start/stop of recording of the real-time video data in live play according to the input operation of the first button.
[3]
The medical program according to [1] or [2], wherein,
the following functions are implemented in the receiving function: receiving information indicating a control request for controlling start/stop of recording of the real-time video data in response to an operation of a predetermined operation unit physically provided in the medical examination apparatus from the medical examination apparatus,
the following functions are implemented in the recording function: and controlling start/stop of recording of the real-time video data in live broadcasting according to the information indicating the control request received from the medical examination device.
[4]
The medical program according to any one of [1] to [3], wherein,
the following functions are implemented in the display control function: when the recorded video data is played and the video is displayed on the display unit, a second button for receiving an input operation of extracting still image data from the recorded video data is displayed on the display unit,
the following functions are implemented in the recording control function: when the second button is input, still image data corresponding to the still image displayed on the display unit is recorded in the recording unit.
[5]
The medical program according to any one of [1] to [4], wherein,
realize a reception function for receiving an input operation of identification information for identifying an inspection object,
the following functions are implemented in the recording control function: the real-time video data is recorded in the recording unit as the recorded video data only in the case where the input operation of the identification information is received.
[6]
The medical program according to [5], wherein,
the following functions are implemented in the recording control function: and recording the real-time video data as the recorded video data in the recording unit in association with the identification information whose input is accepted.
[7]
The medical program according to any one of [1] to [6], wherein,
realize a reception function for receiving an input operation of identification information for identifying an inspection object,
the following functions are implemented in the display control function: and displaying the video represented by the received real-time video data by live play on the display unit only when the input operation of the identification information is received.
[8]
The medical program according to [7], wherein,
the following functions are implemented in the display control function: when displaying the video represented by the real-time video data, the identification information representing the inspection object is displayed on the display unit simultaneously with the display of the video.
[9]
The medical program according to any one of [1] to [8], wherein,
the medical examination device is a device for examining an eye to be examined,
the following functions are implemented in the display control function: a third button for selecting which of the left and right eyes of the subject is the subject to be inspected is displayed on the display unit,
the following functions are implemented in the recording control function: and recording the real-time video data in the recording unit as the recorded video data in correspondence with the left and right selection information of the eye to be inspected, in accordance with the selection operation performed on the third button.
[10]
The medical program according to [9], wherein,
the following functions are implemented in the display control function: when displaying a video represented by the recorded video data, the left and right selection information is displayed on the display unit in association with the video.
[11]
The medical program according to any one of [1] to [10], wherein,
the following functions are realized, including:
a transmission function of transmitting the recorded video data to a predetermined server apparatus;
and a communication control function that performs control so that, of communication in which the reception of the real-time video data is performed by the reception function and communication in which the transmission of the recorded video data is performed by the transmission function, communication with one party is prohibited while communication with the other party is in progress.
[12]
The medical program according to any one of [1] to [11], wherein,
in the display control function, when the recorded video data is played and the video is displayed on the display unit, the video can be enlarged/reduced according to a predetermined operation to change the display range and display,
in the recording control function, it is also possible to record a video displayed by changing a display range by the display control function.
[13]
The medical program according to [12], wherein,
the following functions are implemented in the recording control function: when the second button is input during display of a video displayed by changing the display range according to the display control function, still image data corresponding to a still image of the display range displayed on the display section is recorded in the recording unit.
[14]
A medical examination system comprising a medical examination device provided with a photographing camera, a terminal device, and a server device, characterized in that,
the medical examination apparatus transmits video data generated by photographing an examination object to the terminal apparatus in real time,
the terminal device includes:
A receiving unit that receives various data including video data (hereinafter, referred to as real-time video data) that is transmitted in real time while being captured by the medical examination device;
a display control unit that displays a video represented by the received real-time video data on a display unit included in the terminal device by live broadcasting;
a recording unit that controls start/stop of recording of the real-time video data during live broadcasting in accordance with a predetermined operation performed on the terminal device;
a recording control unit that causes the real-time video data recorded by the recording unit, during the period from the start to the stop of recording, to be recorded as recorded video data in a recording unit provided in the terminal device;
and a transmitting unit that transmits the recorded video data to the server device.
[15]
The medical examination system according to [14], characterized in that,
the server device classifies and manages the recorded video data received from the terminal device based on identification information associated with the terminal device as information for identifying an inspection object.
[16]
The medical examination system according to [14] or [15], characterized in that,
the medical examination device functions as an access point, and cannot be connected to other devices when a communication connection is established with the terminal device.
The second embodiment described above is described in such a manner that at least the following invention can be implemented by one of ordinary skill in the art to which the invention pertains.
[17]
An image processing program for causing a computer to realize functions related to image processing, wherein,
causing the computer to implement:
a video data acquisition function of acquiring video data of a processing object;
a focusing degree calculation function of calculating a focusing degree of a frame constituting a part or all of the video data;
a still image extraction function of extracting, as extracted still image data, a part or all of frames (hereinafter referred to as candidate frames) having a predetermined or more degree of focus among the frames having the degree of focus calculated;
and a recording function of causing the extracted still image data to be recorded in a recording unit provided in the computer.
[18]
The image processing program according to [17], wherein,
in the still image extraction function, when a plurality of the candidate frames exist within a predetermined time range when the candidate frames are arranged in time series, a predetermined number of the candidate frames are excluded from the extraction targets, in order from the candidate frame having the smallest degree of focus.
[19]
The image processing program according to [18], wherein,
in the still image extraction function, when extracting the extracted still image data from the candidate frames, the candidate frames are examined one by one in chronological order, and a candidate frame having a smaller degree of focus than the examined frame is excluded from the extraction targets when it exists within a predetermined time range before the time point of the examined frame.
[20]
The image processing program according to any one of [17] to [19], wherein,
in the still image extraction function, when extracting the extracted still image data from the candidate frames, the frame having the smallest focusing degree among the candidate frames is excluded from the extraction targets so that the total number of the extracted still image data does not exceed a predetermined upper limit number.
[21]
The image processing program according to any one of [17] to [20], wherein,
the calculation of the degree of focus in the degree of focus calculation function is realized by an edge extraction image generation process and a calculation process,
the edge extraction image generation process is to perform an edge extraction process on the frame to generate an edge extraction image,
The calculation process calculates the degree of focusing of a frame corresponding to the edge extraction image based on the distribution of edges in the edge extraction image.
[22]
The image processing program according to any one of [17] to [21], wherein,
a green component image generation process and a grayscaling process are performed in the focusing degree calculation function,
the green component image generation process extracts the green component from the color information of each pixel constituting the frame to generate green component image data,
the grayscaling process converts the green component image data to grayscale to generate grayscale image data,
and the degree of focus is calculated from the grayscale image data obtained by the grayscaling process.
[23]
The image processing program according to [22], wherein,
in the degree of focus calculation function of the present invention,
an adjustment process of adjusting the black level of the grayscale image data is performed,
and the degree of focus is calculated from the grayscale image data whose black level has been adjusted by this adjustment process.
[24]
The image processing program according to any one of [17] to [23], wherein,
in the focusing degree calculation function, one frame among the frames constituting the video data is set as a target frame for calculating the degree of focus every predetermined time interval or every predetermined number of frames.
[25]
The image processing program according to any one of [17] to [23], wherein,
the video data is data of a video obtained by photographing eyes of a subject,
in the focusing degree calculation function, the degree of focus is calculated for a region of the frame from which a predetermined range on the upper side and/or the lower side has been removed.
[26]
The image processing program according to any one of [17] to [25], wherein,
in the video data acquisition function, the acquired video data is caused to be stored in a storage unit provided in the computer,
causing the computer to implement:
and a deleting function of deleting the video data stored in the storage unit at a predetermined timing after the still image data is extracted by the still image extracting function.
[27]
The image processing program according to any one of [17] to [26], wherein,
causing the computer to implement:
a display control function of causing a display unit of the computer to display an extracted still image represented by the extracted still image data;
a receiving function for receiving a selection operation for the displayed extracted still image from a user;
The following functions are implemented in the recording function: the extracted still image data representing the extracted still image that has accepted the selection operation is recorded in the recording unit.
[28]
The image processing program according to [27], wherein,
the following functions are implemented in the display control function: when a predetermined first input operation is performed, a video represented by video data including the extracted still image is displayed on the display unit,
the following functions are implemented in the acceptance function: accepting a selection operation of the frame of the video displayed,
the following functions are implemented in the recording function: still image data representing the frame that has accepted the selection operation is recorded in the recording unit.
[29]
The image processing program according to [28], wherein,
in the display control function, when a predetermined second input operation is performed on any one of the extracted still images represented by the extracted still image data, the video data is continued to be played back after the extracted still image data.
Description of the reference numerals
100. Medical examination system
110. Medical examination device
120. Terminal device
121. Receiving part
122. Display control unit
123. Recording unit
124. Recording control unit
125. Receiving part
126. Transmitting unit
127. Communication control unit
130. Server device
210. Medical examination device
220. Image processing apparatus
221. Video data acquisition unit
222. Focusing degree calculating part
223. Still image extraction unit
224. Display control unit
225. Receiving part
226. Recording unit
227. Deletion part
230. Server device

Claims (15)

1. A medical program for realizing various processes in a terminal device used with a medical examination device having a photographing camera, wherein the medical program realizes the following functions, comprising:
a receiving function of receiving various data including video data transmitted in real time while being photographed by the medical examination apparatus, the video data being hereinafter referred to as real-time video data;
a display control function of displaying a video represented by the received real-time video data by live broadcasting on a display unit included in the terminal device;
a recording function of controlling start/stop of recording of the real-time video data in live play in accordance with a prescribed operation performed on the terminal device;
And a recording control function of recording the real-time video data, which is recorded by the recording function, in a recording unit provided in the terminal device until the recording is stopped, as recorded video data.
2. The medical program according to claim 1, wherein,
the following functions are implemented in the display control function: a first button for instructing start/stop of recording of the real-time video data in live play is displayed on the display unit, which is a touch screen provided in the terminal device,
the following functions are implemented in the recording function: and controlling the start/stop of recording of the real-time video data in live play according to the input operation of the first button.
3. The medical program according to claim 1, wherein,
the following functions are implemented in the receiving function: receiving information indicating a control request for starting/stopping recording of the real-time video data in response to an operation of a predetermined operation unit physically provided in the medical examination apparatus from the medical examination apparatus,
the following functions are implemented in the recording function: and controlling start/stop of recording of the real-time video data in live broadcasting according to the information indicating the control request received from the medical examination device.
4. The medical program according to claim 1, wherein,
the following functions are implemented in the display control function: when the recorded video data is played and the video is displayed on the display unit, a second button is displayed on the display unit, the second button is used for receiving an input operation of extracting the still image data from the recorded video data,
the following functions are implemented in the recording control function: when the second button is input, still image data corresponding to the still image displayed on the display unit is recorded in the recording unit.
5. The medical program according to claim 1, wherein,
realize a reception function for receiving an input operation of identification information for identifying an inspection object,
the following functions are implemented in the recording control function: and recording the real-time video data in the recording unit as the recorded video data only in the case where the input operation of the identification information is received.
6. The medical program according to claim 1, wherein,
realize a reception function for receiving an input operation of identification information for identifying an inspection object,
The following functions are implemented in the display control function: and displaying the video represented by the received real-time video data by live play on the display unit only when the input operation of the identification information is received.
7. The medical program according to claim 1, wherein,
the medical examination device is used for examining the examined eyes,
the following functions are implemented in the display control function: a third button for selecting which of the left and right eyes of the subject is the subject to be inspected is displayed on the display unit,
the following functions are implemented in the recording control function: and recording the real-time video data in the recording unit as the recorded video data in correspondence with the left and right selection information of the eye to be inspected, in accordance with the selection operation performed on the third button.
8. The medical program according to claim 1, wherein the following functions are realized, including:
a transmission function of transmitting the recorded video data to a predetermined server apparatus;
and a communication control function that performs control so that, of communication in which the reception of the real-time video data is performed by the reception function and communication in which the transmission of the recorded video data is performed by the transmission function, communication with one party is prohibited while communication with the other party is in progress.
9. The medical program according to claim 1, wherein,
in the display control function, when the recorded video data is played and the video is displayed on the display unit, the video can be enlarged/reduced according to a predetermined operation to change the display range and display,
in the recording control function, it is also possible to record a video displayed by changing a display range by the display control function.
10. The medical program according to claim 1, wherein the following functions are realized, including:
a focusing degree calculation function of calculating a focusing degree of a frame constituting a part or all of the recorded video data;
a still image extraction function of extracting, as extracted still image data, a part or all of the frames having a degree of focus equal to or higher than a predetermined value among the frames for which the degree of focus has been calculated, such frames being hereinafter referred to as candidate frames;
and a recording function of recording the extracted still image data in a recording unit provided in the terminal device.
11. The medical program according to claim 10, wherein,
In the still image extraction function, when a plurality of the candidate frames exist within a predetermined time range when the candidate frames are arranged in time series, a predetermined number of the candidate frames are excluded from the extraction targets, in order from the candidate frame having the smallest degree of focus.
12. The medical program according to claim 10, wherein,
in the still image extraction function, when extracting the extracted still image data from the candidate frames, the frame having the smallest focusing degree among the candidate frames is excluded from the extraction targets so that the total number of the extracted still image data does not exceed a predetermined upper limit number.
13. A medical examination system comprising a medical examination device provided with a photographing camera, a terminal device, and a server device, characterized in that,
the medical examination apparatus transmits video data generated by photographing an examination object to the terminal apparatus in real time,
the terminal device includes:
a receiving unit that receives various data including video data that is transmitted in real time while being captured by the medical examination apparatus, and the video data that is transmitted in real time while being captured by the medical examination apparatus is hereinafter referred to as real-time video data;
A display control unit that displays a video represented by the received real-time video data on a display unit included in the terminal device by live broadcasting;
a recording unit that controls start/stop of recording of the real-time video data during live broadcasting in accordance with a predetermined operation performed on the terminal device;
a recording control unit that causes recording of the real-time video data by the recording unit to be performed in a recording unit provided in the terminal, the recording unit being configured to record the real-time video data as recorded video data in a period from start to stop of recording of the real-time video data;
and a transmitting unit that transmits the recorded video data to the server device.
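The terminal-side units of claim 13 map onto a small state machine: every incoming frame is displayed live, and frames are additionally buffered between a start and a stop operation, after which the buffered recording is handed to the transmitting unit. The sketch below is a hypothetical illustration only; the callback interfaces and in-memory buffer are assumptions, not the patent's implementation.

```python
# Minimal sketch of the terminal device in claim 13.

class Terminal:
    def __init__(self, display, uploader):
        self.display = display    # display control unit (callback)
        self.uploader = uploader  # transmitting unit (callback)
        self.recording = False
        self.buffer = []          # recording unit (in-memory here)

    def on_frame(self, frame):
        """Receiving unit: real-time video data arrives frame by frame."""
        self.display(frame)       # always shown by live broadcasting
        if self.recording:
            self.buffer.append(frame)

    def toggle_recording(self):
        """Recording control unit: start/stop per the predetermined operation."""
        if self.recording:        # stop: pass recorded video data to uploader
            self.recording = False
            self.uploader(self.buffer)
            self.buffer = []
        else:                     # start
            self.recording = True
```

Note that display and recording are deliberately decoupled: stopping the recording never interrupts the live view, which preserves the feel of using the examination device directly.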
14. The medical examination system according to claim 13, wherein,
the server device classifies and manages the recorded video data received from the terminal device based on identification information that is associated with the terminal device and serves as information for identifying the examination object.
15. The medical examination system according to claim 13, wherein,
the medical examination device functions as an access point and, once a communication connection with the terminal device is established, does not accept connections from other devices.
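The exclusive-connection behaviour of claim 15 is, in essence, a single-slot gatekeeper: the first terminal to connect holds the slot, and later attempts are refused until it disconnects. The sketch below models only that pairing logic (the class and method names are hypothetical; real access-point behaviour would live in the device's Wi-Fi stack):

```python
# Sketch of claim 15: while one terminal device holds the connection,
# connection attempts from other devices are refused.

class AccessPoint:
    def __init__(self):
        self.client = None

    def connect(self, device_id):
        """Return True if the connection is accepted, False if refused."""
        if self.client is not None:
            return False          # already paired: refuse other devices
        self.client = device_id
        return True

    def disconnect(self, device_id):
        if self.client == device_id:
            self.client = None    # slot freed; a new device may connect
```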
CN202280042725.XA 2021-06-17 2022-06-16 Medical program and medical examination system Pending CN117500429A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2021-101006 2021-06-17
JP2021-154875 2021-09-22
JP2021154875 2021-09-22
PCT/JP2022/024128 WO2022265064A1 (en) 2021-06-17 2022-06-16 Medical program and medical examination system

Publications (1)

Publication Number Publication Date
CN117500429A true CN117500429A (en) 2024-02-02

Family

ID=89669498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280042725.XA Pending CN117500429A (en) 2021-06-17 2022-06-16 Medical program and medical examination system

Country Status (1)

Country Link
CN (1) CN117500429A (en)

Similar Documents

Publication Publication Date Title
US8532345B2 (en) Camera and image recording program product
JP5246275B2 (en) Imaging apparatus and program
US9491366B2 (en) Electronic device and image composition method thereof
JP4961965B2 (en) Subject tracking program, subject tracking device, and camera
JP5784859B2 (en) Image management device
JP5490477B2 (en) Hyperemia calculation program and hyperemia calculation device
JP2008035149A (en) Video recording and reproducing system and video recording and reproducing method
US8934699B2 (en) Information processing apparatus, information processing method, program, and recording medium
WO2018142664A1 (en) Endoscopic image observation assistance system
CN111202494A (en) Skin analysis device, skin analysis method, and recording medium
JP2014053723A (en) Medical image management device, medical image management method and medical image management program
JP2018084861A (en) Information processing apparatus, information processing method and information processing program
JP2007049631A (en) Imaging apparatus
CN117500429A (en) Medical program and medical examination system
JP6425868B1 (en) ENDOSCOPIC IMAGE OBSERVATION SUPPORT SYSTEM, ENDOSCOPIC IMAGE OBSERVATION SUPPORT DEVICE, AND ENDOSCOPIC IMAGE OBSERVATION SUPPORT METHOD
JP6335412B1 (en) Endoscopic image observation support system
WO2022265064A1 (en) Medical program and medical examination system
JP2013121097A (en) Imaging apparatus, imaging method, image generating apparatus, image generating method and program
JP6935663B1 (en) Oral mucosal disease diagnosis support system, method and program
US11595584B2 (en) Imaging apparatus, method of controlling imaging apparatus and computer-readable medium
US11880975B2 (en) Information processing apparatus, method for controlling information processing apparatus, and storage medium
US20210042924A1 (en) Information processing apparatus, method for controlling information processing apparatus, and storage medium
US20230206388A1 (en) Method and device for outputting pathology slide image
JP2012133459A (en) Apparatus, method and program for estimating image, and computer-readable recording medium storing the program
WO2022145294A1 (en) Image processing apparatus, image capture apparatus, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40100087
Country of ref document: HK