WO2020116351A1 - Diagnosis support device and diagnosis support program - Google Patents

Diagnosis support device and diagnosis support program

Info

Publication number
WO2020116351A1
WO2020116351A1 PCT/JP2019/046856 JP2019046856W
Authority
WO
WIPO (PCT)
Prior art keywords
disease
data
output
support device
diagnosis
Prior art date
Application number
PCT/JP2019/046856
Other languages
English (en)
Japanese (ja)
Inventor
壮平 宮崎
祐輔 坂下
佳紀 熊谷
友洋 宮城
Original Assignee
株式会社ニデック
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニデック filed Critical 株式会社ニデック
Priority to JP2020559148A priority Critical patent/JPWO2020116351A1/ja
Publication of WO2020116351A1 publication Critical patent/WO2020116351A1/fr

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present disclosure relates to a diagnosis support device and a diagnosis support program that support diagnosis of an eye to be inspected.
  • An ophthalmologic imaging apparatus is, for example, an optical coherence tomography (OCT) apparatus, a fundus camera, a scanning laser ophthalmoscope (SLO), or the like.
  • The present disclosure addresses the technical problem of providing a diagnosis support device and a diagnosis support program that facilitate a definitive diagnosis based on disease information.
  • the present disclosure is characterized by having the following configurations.
  • A diagnosis support apparatus for supporting diagnosis of an eye to be inspected, wherein a control means of the diagnosis support apparatus acquires disease information based on at least one piece of examination data of the eye to be inspected and causes an output means to output information corresponding to a disease of interest selected using the disease information.
  • A diagnosis support program executed by a processor of a diagnosis support apparatus for supporting diagnosis of an eye to be inspected, the program causing the diagnosis support apparatus to perform an acquisition step of acquiring disease information based on at least one piece of examination data of the eye to be inspected and an output step of causing an output means to output information corresponding to a disease of interest selected using the disease information.
  • The drawings referred to below are: a block diagram showing a schematic configuration of the diagnosis support device 10; a flowchart showing a control operation of the diagnosis support device 10; FIG. 3, showing an example of the result screen of an automatic analysis; and FIG. 4 and FIG. 5, each showing an example of a definitive diagnosis screen.
  • the diagnosis support apparatus of the present disclosure supports diagnosis of an eye to be inspected.
  • the diagnosis support device includes, for example, a control unit (for example, the control unit 11).
  • the control unit acquires disease information (for example, information on a disease, a finding, an abnormal site, or the like) based on at least one examination data of the eye to be inspected.
  • the control unit may obtain disease information accumulated in a database or the like via the Internet, or may obtain disease information stored locally.
  • the control unit causes the output unit (for example, the display unit 15 or the like) to output information corresponding to the disease of interest selected using the disease information.
  • The diagnosis support apparatus can thereby save the user the trouble of selecting and outputting the corresponding test data when making a definitive diagnosis of the disease of interest.
  • the disease of interest may include information such as a disease, a finding, or an abnormal site.
  • the inspection data may be a captured image of the eye to be inspected or an inspection value of the eye to be inspected.
  • Examples of the captured image include a fundus image and an anterior segment image.
  • the fundus image is an image of the fundus of the subject's eye.
  • The fundus image may be a fundus tomographic image captured by, for example, an optical coherence tomography (OCT) apparatus or a Scheimpflug camera, or may be an image captured by a fundus camera, a scanning laser ophthalmoscope (SLO), or the like.
  • The anterior segment image may be, for example, an anterior segment front image captured by an anterior segment observation camera, or may be an anterior segment OCT image, a transillumination (retroillumination) image, a slit lamp image, a gonioscopy image, or the like.
  • the inspection value is, for example, an eye refractive power, a visual acuity value, an intraocular pressure value, a visual field inspection value, or the like.
  • The visual field inspection value is, for example, a parameter output by the perimeter, such as an MD value (mean deviation; Humphrey perimeter), an average value of the visual sensitivity threshold, PSD (pattern standard deviation), CPSD (corrected pattern standard deviation), or SF (short-term fluctuation).
  • When outputting a captured image as the information corresponding to the disease of interest, the control unit may cause the output unit to output a captured image that shows the disease, the finding, or the abnormal part.
  • the information corresponding to the disease of interest may include, for example, test data different from the test data used to obtain the disease information.
  • the diagnosis support apparatus can provide the user with additional information when the user makes a definite diagnosis for the disease of interest.
  • the control unit may cause the output unit to output a plurality of test data as information corresponding to the disease of interest.
  • control unit may accept an operation signal.
  • the operation signal is output from, for example, an operation unit (for example, the operation unit 16) operated by the user.
  • the control unit may cause the output unit to output information corresponding to the disease of interest selected based on the operation signal.
  • the control unit may display the disease information and the test data used to obtain the disease information on the display unit.
  • the user may select the disease of interest based on the disease information and the test data displayed on the display unit.
  • control unit may change or add the output of information in the output unit each time another disease of interest is selected by the user. For example, the control unit may change the type of test data displayed on the display unit based on the selection of the disease of interest.
  • the control unit may cause the output unit to output information corresponding to the disease of interest selected based on the specific condition.
  • the specific condition may be a condition set based on the certainty factor of the disease information (for example, the probability of existence of a disease, a finding, an abnormal site, etc.).
  • information corresponding to the disease of interest with high certainty may be displayed on the display unit.
  • the specific condition may be set by the user.
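  • As a minimal sketch of such a certainty-based condition (an assumption for illustration, not a configuration prescribed by the disclosure), the following Python code selects the diseases of interest whose existence probability meets a user-set threshold; the function name, the 0.30 threshold, and the disease labels are hypothetical.

```python
def select_diseases_of_interest(disease_info, threshold=0.30):
    """Return disease labels whose existence probability meets the threshold,
    ordered from the most to the least certain."""
    selected = [(name, p) for name, p in disease_info.items() if p >= threshold]
    selected.sort(key=lambda item: item[1], reverse=True)
    return [name for name, _ in selected]

# Probabilities matching the example result screen described later (FIG. 3).
disease_info = {"disease A": 0.39, "disease B": 0.19, "disease C": 0.04,
                "disease D": 0.08, "disease E": 0.09}
print(select_diseases_of_interest(disease_info))  # -> ['disease A']
```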
  • the control unit may acquire disease information by inputting test data into a mathematical model trained by a machine learning algorithm.
  • the mathematical model may be configured to output the existence probabilities of each of a plurality of diseases, findings, abnormal sites, etc. based on the input of the inspection data, for example.
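  • A minimal sketch of such a model is shown below, assuming a small PyTorch CNN with sigmoid outputs so that each disease, finding, or abnormal site receives an independent existence probability. The architecture, input size, and label names are illustrative assumptions (the disclosure does not specify them), and the untrained weights here produce meaningless probabilities; training is described later.

```python
import torch
import torch.nn as nn

LABELS = ["disease A", "disease B", "disease C", "disease D", "disease E"]

class DiseaseProbabilityNet(nn.Module):
    """Hypothetical CNN: one OCT B-scan in, one existence probability per label out."""
    def __init__(self, num_labels=len(LABELS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # global pooling -> (B, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, num_labels)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))   # independent probabilities

model = DiseaseProbabilityNet().eval()
b_scan = torch.rand(1, 1, 256, 512)                 # placeholder grayscale B-scan
with torch.no_grad():
    probs = model(b_scan)[0]
disease_info = dict(zip(LABELS, probs.tolist()))    # e.g. {"disease A": 0.47, ...}
```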
  • the control unit may cause the output unit to output the transition of the similar case data that is similar to the examination data and/or the disease information among the case data accumulated in the database (for example, the data server 30). For example, the control unit may determine the degree of similarity between test data and/or disease information and past case data. At this time, the control unit may determine the degree of similarity based on the feature amount of each data. The control unit causes the output unit to output the transition of a case with a high degree of similarity.
  • the transition of the similar case data may be transition of image data, test value, treatment content, similar region, or different region, or a combination thereof.
  • The condition for selecting the similar case data may be, for example, a condition based on the magnitude of the similarity (for example, up to an arbitrary rank), a condition set by default, or a condition set by the user.
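  • The following NumPy sketch shows one plausible way to realize such a selection, assuming that a feature vector (feature amount) has already been extracted from the current examination data and from each stored case; cosine similarity is used as the metric and a rank-based (top-k) condition keeps the most similar cases. All names and shapes are illustrative.

```python
import numpy as np

def top_k_similar_cases(query_feature, case_features, case_ids, k=3):
    """Return (case_id, cosine_similarity) for the k most similar past cases."""
    q = query_feature / np.linalg.norm(query_feature)
    c = case_features / np.linalg.norm(case_features, axis=1, keepdims=True)
    sims = c @ q                                   # cosine similarity per case
    order = np.argsort(sims)[::-1][:k]             # indices of the k best matches
    return [(case_ids[i], float(sims[i])) for i in order]

rng = np.random.default_rng(0)
query = rng.normal(size=128)                       # feature vector of the current eye
past = rng.normal(size=(500, 128))                 # feature vectors of 500 stored cases
ids = [f"case-{i:04d}" for i in range(500)]
print(top_k_similar_cases(query, past, ids, k=3))
```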
  • the processor (for example, the CPU 12) of the diagnostic support device may execute the diagnostic support program stored in the storage unit (for example, the storage unit 13).
  • the diagnosis support program includes, for example, an acquisition step and an output step.
  • the acquisition step is a step of acquiring disease information based on at least one examination data of the eye to be inspected.
  • the output step is a step of causing the output unit to output information corresponding to the focused disease selected using the disease information.
  • the diagnosis support apparatus assists the user in diagnosing the eye to be inspected.
  • the diagnosis support device acquires, for example, at least one examination data of the eye to be inspected, and provides the user with information for supporting the diagnosis.
  • the diagnosis support device is realized by, for example, a personal computer (hereinafter referred to as “PC”).
  • the diagnosis support device 10 includes, for example, a control unit 11 and a communication unit 14.
  • the control unit 11 controls the diagnosis support device 10.
  • The control unit 11 includes a CPU 12 serving as a controller that performs control processing, and a storage unit 13 that can store programs and data.
  • the storage unit 13 stores a diagnosis support program for supporting the user's diagnosis.
  • the communication unit 14 connects the diagnosis support apparatus 10 to another device (for example, the data server 30) via the network 60 (for example, the Internet).
  • the diagnosis support device 10 may include a display unit 15, an operation unit 16 and the like.
  • the display unit 15 displays inspection data, diagnosis support information, and the like.
  • the display on the display unit 15 is controlled by the control unit 11.
  • As the display unit 15, various devices capable of displaying images (for example, at least one of a monitor, a display, a projector, etc.) can be used.
  • the “image” in the present disclosure includes both still images and moving images.
  • the operation unit 16 is operated by the user so that the user inputs various instructions to the diagnosis support apparatus 10.
  • As the operation unit 16, for example, at least one of a keyboard, a mouse, a touch panel, or the like can be used.
  • the control unit 11 can exchange test data with the ophthalmologic apparatus 20.
  • the method by which the control unit 11 exchanges the examination data with the ophthalmologic apparatus 20 can be appropriately selected.
  • the control unit 11 may exchange test data with the ophthalmologic apparatus 20 by at least one of wired communication, wireless communication, a removable storage medium (for example, a USB memory), and the like.
  • the diagnosis support device 10 is connected to the data server 30 via the network 60. Thereby, the control unit 11 can also acquire the inspection data accumulated in the data server 30.
  • the data server 30 stores (stores) examination data and the like acquired by the ophthalmologic apparatus 50 different from the ophthalmologic apparatus 20. Therefore, the control unit 11 can acquire inspection data of a plurality of types (different modalities).
  • diagnosis support device 10 is not limited to a PC, and may be realized by an ophthalmologic device, a tablet terminal, or a mobile terminal such as a smartphone. Further, the control units (for example, the control unit 11 and the control unit 21 of the ophthalmologic apparatus 20) of a plurality of devices may cooperate to function as the diagnosis support apparatus 10.
  • the CPU is used as an example of the controller that performs various processes, but a controller other than the CPU may be used for at least a part of the various devices.
  • a GPU may be used as the controller to speed up the process.
  • the ophthalmologic apparatus 20 will be described.
  • the ophthalmologic apparatus 20 of the present embodiment is an OCT apparatus capable of capturing a tomographic image or the like of the tissue of the subject's eye.
  • the ophthalmologic apparatus 20 may be an ophthalmologic apparatus other than the OCT apparatus.
  • For example, the ophthalmologic apparatus 20 may be a scanning laser ophthalmoscope (SLO), a fundus camera, a Scheimpflug camera, a corneal endothelial cell photographing apparatus, an eye refractive power measuring apparatus, a corneal shape measuring apparatus, an anterior chamber angle photographing apparatus, a tonometer, a perimeter, or the like.
  • the ophthalmologic apparatus 20 includes a control unit 21 that performs various control processes and an imaging unit 24.
  • The control unit 21 includes a CPU 22 serving as a controller that performs control processing, and a storage unit 23 that can store programs and data.
  • the image capturing unit 24 has various configurations necessary for capturing an ophthalmologic image of the subject's eye.
  • The imaging unit 24 of the present embodiment includes an OCT light source, a branching optical element that splits the OCT light emitted from the OCT light source into measurement light and reference light, a scanning unit for scanning the measurement light, an optical system for irradiating the subject's eye with the measurement light, a light receiving element that receives the combined light of the measurement light reflected by the tissue of the eye to be examined and the reference light, and the like.
  • the ophthalmologic apparatus 20 can take a two-dimensional tomographic image and a three-dimensional tomographic image of the fundus of the eye to be inspected.
  • the control unit 21 scans the scan line with OCT light (measurement light) to capture a two-dimensional tomographic image of a cross section intersecting the scan line.
  • the two-dimensional tomographic image may be an arithmetic mean image generated by performing arithmetic mean processing on a plurality of tomographic images of the same site.
  • the control unit 21 can also capture a three-dimensional tomographic image of the tissue by two-dimensionally scanning the OCT light.
  • For example, the control unit 21 acquires a plurality of two-dimensional tomographic images by scanning the measurement light along each of a plurality of scan lines at different positions within a two-dimensional region of the tissue viewed from the front. Next, the control unit 21 acquires a three-dimensional tomographic image by combining the captured two-dimensional tomographic images.
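  • As a rough NumPy illustration of the two kinds of tomographic image described above (the arithmetic-mean image and the three-dimensional image assembled from B-scans on parallel scan lines); the array shapes and the random data are placeholders, not values taken from the apparatus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arithmetic-mean B-scan: average repeated scans of the same cross section
# to reduce noise.
repeated_b_scans = rng.random((8, 512, 256))       # (repeats, depth, width)
averaged_b_scan = repeated_b_scans.mean(axis=0)    # (512, 256)

# 3D tomographic image: B-scans captured on parallel scan lines, then combined.
b_scans = [rng.random((512, 256)) for _ in range(128)]  # one B-scan per scan line
volume = np.stack(b_scans, axis=0)                 # (128, 512, 256) volume
```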
  • the data server 30 stores the inspection data.
  • the data server 30 acquires the examination data of the ophthalmologic apparatus 50 via the terminal device 40.
  • the data server 30 may directly acquire the examination data from the ophthalmologic apparatus 50 without using the terminal device 40.
  • the data server 30 may acquire the examination data not only from the ophthalmologic apparatus 50 but also from other ophthalmologic apparatuses connected to the network 60.
  • the data server 30 may acquire and store the inspection data of the ophthalmologic apparatus 20.
  • the terminal device 40 is, for example, a PC or the like that can exchange examination data with the ophthalmologic apparatus 50.
  • the terminal device 40 includes, for example, a control unit 41 and a communication unit 44.
  • the control unit 41 controls the terminal device 40.
  • The control unit 41 includes a CPU 42 serving as a controller that performs control processing, and a storage unit 43 that can store programs and data.
  • the communication unit 44 connects the terminal device 40 to another device (for example, the data server 30) via the network 60 (for example, the Internet).
  • the terminal device 40 may include a display unit 45, an operation unit 46, and the like.
  • the ophthalmologic apparatus 50 will be described.
  • the ophthalmologic apparatus 50 of the present embodiment is a non-contact tonometer that can measure the intraocular pressure of the subject's eye in a non-contact manner.
  • the ophthalmologic apparatus 50 may be an ophthalmologic apparatus other than the tonometer.
  • the ophthalmologic apparatus 50 may be an OCT, an SLO, a fundus camera, a Scheimpflug camera, a corneal endothelial cell photographing device, an eye refractive power measuring device, a cornea measuring device, a corner angle photographing device, or a perimeter.
  • the ophthalmologic apparatus 50 includes a control unit 51 that performs various control processes and an inspection unit 54.
  • The control unit 51 includes a CPU 52 serving as a controller that performs control processing, and a storage unit 53 that can store programs and data.
  • the inspection unit 54 includes various components necessary for measuring the intraocular pressure of the eye to be inspected.
  • the inspection unit 54 of the present embodiment includes a fluid ejection unit that ejects air to the eye to be inspected, a deformation detection unit that detects deformation of the cornea, and the like.
  • the inspection unit 54 measures, for example, the intraocular pressure based on the pressure when air is ejected to the eye to be inspected and the deformed state of the cornea.
  • the diagnosis support device 10 acquires disease information of the eye to be inspected based on the inspection data.
  • the control unit 11 acquires disease information by automatic analysis using a mathematical model stored in the storage unit 13.
  • the mathematical model is trained by, for example, a machine learning algorithm.
  • the mathematical model outputs, for example, the probability that each disease, finding, or abnormal site exists in the eye to be examined.
  • the control unit 11 inputs the inspection data into the mathematical model to acquire the existence probabilities of each of the plurality of diseases and the like.
  • The mathematical model is constructed by a mathematical model construction process.
  • In the construction process, the mathematical model is trained with a training data set, whereby a mathematical model that outputs the probability that a disease or a finding of the eye to be examined exists is constructed.
  • the training data set includes data on the input side (training data for input) and data on the output side (training data for output).
  • The mathematical model learns the training data set based on a machine learning algorithm.
  • Neural networks, random forests, boosting, support vector machines (SVM), etc. are generally known as machine learning algorithms.
  • A neural network is a method that mimics the behavior of biological nerve cell networks.
  • Examples of the neural network include a feedforward (forward propagation) neural network, an RBF network (radial basis function network), a spiking neural network, a convolutional neural network, a recurrent neural network (feedback neural network, etc.), and a probabilistic neural network (Boltzmann machine, Bayesian network, etc.).
  • A random forest is a method of generating a large number of decision trees by learning from randomly sampled training data.
  • When a random forest is used, the branches of the plurality of decision trees learned as discriminators are traced, and the average (or majority vote) of the results obtained from each decision tree is taken.
  • Boosting is a method of generating a strong classifier by combining multiple weak classifiers.
  • a strong classifier is constructed by sequentially learning simple and weak classifiers.
  • SVM is a method of configuring a two-class pattern classifier using a linear input element.
  • For example, the SVM learns the parameters of the linear input element from the training data based on the criterion of obtaining a margin-maximizing hyperplane that maximizes the distance to each data point (hyperplane separation theorem).
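  • For reference, this margin-maximization criterion is commonly written as the following optimization over training pairs (x_i, y_i) with labels y_i in {-1, +1}; maximizing the margin 2/||w|| is equivalent to minimizing ||w||^2 under the constraints. This is the standard hard-margin formulation, stated here for clarity rather than quoted from the disclosure.

```latex
\min_{\mathbf{w},\,b}\ \frac{1}{2}\,\lVert \mathbf{w} \rVert^{2}
\quad \text{subject to} \quad
y_i \left( \mathbf{w}^{\top} \mathbf{x}_i + b \right) \ge 1, \qquad i = 1, \dots, N
```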
  • A mathematical model refers to, for example, a data structure for predicting the relationship between input data and output data.
  • the mathematical model is constructed by being trained with the training data set.
  • the training data set is a set of input training data and output training data.
  • As the input training data, inspection data of subjects' eyes acquired in the past is used.
  • As the output training data, data of the diagnosis results, such as the disease name and the position of the disease, is used.
  • The mathematical model is trained such that, when certain input training data is input, the corresponding output training data is output. For example, the training updates the correlation data (for example, the weights) between each input and output.
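  • A minimal PyTorch sketch of one such update step is given below; the stand-in model, the multi-label encoding of the diagnosis results, and the random placeholder images are assumptions made for illustration only.

```python
import torch
import torch.nn as nn

model = nn.Sequential(                      # stand-in for the mathematical model
    nn.Flatten(),
    nn.Linear(256 * 256, 64), nn.ReLU(),
    nn.Linear(64, 5),                       # logits for 5 diseases/findings
)
criterion = nn.BCEWithLogitsLoss()          # multi-label "existence" targets
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

images = torch.rand(4, 1, 256, 256)         # input training data (past images)
labels = torch.tensor([[1., 0., 0., 0., 0.],    # output training data:
                       [0., 1., 0., 0., 1.],    # confirmed diagnoses encoded as
                       [0., 0., 0., 1., 0.],    # disease-existence flags
                       [1., 1., 0., 0., 0.]])

optimizer.zero_grad()
loss = criterion(model(images), labels)     # distance between output and labels
loss.backward()                             # gradients w.r.t. the weights
optimizer.step()                            # update the correlation data (weights)
```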
  • a multilayer neural network is used as a machine learning algorithm.
  • a neural network includes an input layer for inputting data, an output layer for generating data to be predicted, and one or more hidden layers between the input layer and the output layer.
  • Each layer contains a plurality of nodes (also called units).
  • a convolutional neural network (CNN), which is a type of multilayer neural network, is used.
  • A generative adversarial network (GAN) may also be used as the machine learning algorithm.
  • the automatic analysis may be performed by clustering, pixel vector, cosine similarity, or other mathematical correlation calculation method.
  • The control operation of the diagnosis support device 10 having the above configuration will be described with reference to the flowchart.
  • The diagnosis support apparatus 10 provides the user with the test data necessary for a definitive diagnosis based on the disease information of the eye to be examined.
  • Step S1 Acquisition of inspection data
  • the control unit 11 acquires the examination data of the eye to be examined from the storage unit 13 of the ophthalmologic apparatus 20, for example.
  • the control unit 11 acquires a tomographic image (OCT image) from the ophthalmologic apparatus 20.
  • the test data acquired here is used to acquire disease information in step S2.
  • the control unit 11 may access the network 60 via the communication unit 14 and acquire the examination data of the eye to be examined from the data server 30. In this case, the control unit 11 acquires the examination data of the eye to be inspected from the data server 30 based on the information such as the designated personal ID and name.
  • Step S2 Disease information acquisition
  • the control unit 11 acquires, for example, disease information of the eye to be inspected based on the tomographic image input in step S1.
  • the control unit 11 inputs the inspection data into the mathematical model trained by the machine learning algorithm to acquire the existence probabilities for each of the plurality of diseases, findings, or abnormal parts in the eye to be inspected.
  • the control unit 11 acquires the disease information of the eye to be inspected by inputting at least one inspection data into the mathematical model.
  • the control unit 11 inputs the tomographic image into the mathematical model to acquire the probabilities of each of the plurality of diseases and the like.
  • the disease information is not limited to automatic analysis and may be acquired based on user input.
  • Step S3 disease information output
  • the control unit 11 outputs the disease information to the display unit 15.
  • FIG. 3 is an example of a result display screen 100 that displays disease information.
  • the control unit 11 displays the inspection data 110 input to the mathematical model and the button group 120 on the result display screen 100.
  • a tomographic image 111 is displayed as the inspection data 110.
  • an Enface image 112 is displayed on the result display screen 100.
  • the Enface image 112 is, for example, a two-dimensional image when the three-dimensional OCT data is viewed from the front direction of the subject's eye.
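  • For example, such an Enface image can be approximated by projecting the three-dimensional OCT data along the depth (A-scan) axis, as in the NumPy sketch below; the mean projection, the array shape, and the 8-bit scaling are illustrative choices only.

```python
import numpy as np

# Placeholder 3D OCT data: (scan lines, depth, width).
volume = np.random.default_rng(0).random((128, 512, 256))

# Enface view: collapse the depth axis to look at the tissue from the front.
enface = volume.mean(axis=1)                                   # (128, 256)

# Scale to 8-bit for on-screen display.
enface_8bit = ((enface - enface.min()) / (np.ptp(enface) + 1e-12) * 255).astype(np.uint8)
```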
  • Each disease is assigned to one of the buttons in the button group 120, and the probability that the disease exists is displayed on each button (buttons 121 to 125).
  • the control unit 11 causes the result display screen 100 to display, for example, the test data 110 and the probability that the disease exists (confidence level).
  • the disease A is assigned to the button 121, the disease B to the button 122, the disease C to the button 123, the disease D to the button 124, and the disease E to the button 125, respectively.
  • the existence probability of each disease is displayed on each of the buttons 121 to 125.
  • For example, the probability that disease A exists is 39%, disease B 19%, disease C 4%, disease D 8%, and disease E 9%.
  • the probability does not always have to be displayed.
  • Step S4 Accept selection signal
  • the user confirms the probability of the disease displayed on each button and selects any of the buttons 121 to 125. For example, the user operates the operation unit 16 and presses any of the buttons 121 to 125.
  • the operation unit 16 transmits a selection signal based on the user's selection to the control unit 11.
  • the control unit 11 receives the selection signal output from the operation unit 16.
  • Step S5 Output of related data
  • When the control unit 11 receives a selection signal indicating that one of the buttons 121 to 125 has been pressed, the control unit 11 displays a diagnostic screen for the selected disease (the disease of interest).
  • FIG. 4 is an example of the diagnostic screen 101.
  • the control unit 11 causes the diagnosis screen 101 to display information (related data 130) corresponding to the disease of interest.
  • the related data 130 is, for example, test data necessary for definite diagnosis of a disease, and also includes test data not used for automatic analysis.
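  • One simple way to realize this is a lookup table from the selected disease of interest to the kinds of examination data to fetch and lay out on the screen, as in the sketch below; the mapping merely mirrors the example screens of FIG. 4 and FIG. 5 and is an assumption, not a configuration prescribed by the disclosure.

```python
# Hypothetical mapping: disease of interest -> examination data types to display.
RELATED_DATA_TYPES = {
    "disease A": ["retinal thickness map", "tomographic image",
                  "thickness analysis chart", "normative comparison",
                  "visual sensitivity map", "fluorescein angiography",
                  "intraocular pressure"],
    "disease B": ["OCT angiography", "tomographic image",
                  "similar case transition"],
}

def related_data_for(disease_of_interest):
    """Data types to request from the data server for the diagnostic screen."""
    return RELATED_DATA_TYPES.get(disease_of_interest, ["tomographic image"])

print(related_data_for("disease B"))
```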
  • The control unit 11 may acquire the related data 130 from the data server 30 when the selection signal is received, or may acquire all the examination data regarding the subject from the data server 30 in advance and keep it stored in the storage unit 13.
  • FIG. 4 shows the diagnosis screen 101 when the button 121 of the disease A (for example, retinal disease) is pressed.
  • For example, the control unit 11 displays, on the diagnostic screen 101, a thickness map 131 indicating the thickness of the retina, a tomographic image 132, a retinal thickness analysis chart 133, a comparison image 134 comparing the retinal thickness with that of a normal eye, a fundus image 135 on which the visual sensitivity measurement results are superimposed, a fluorescein fundus angiography image 136 captured using a contrast agent, an intraocular pressure value 137, and the like.
  • On the inspection data (for example, the tomographic image), a frame representing the finding detection position output by the automatic analysis, a finding detection probability map, or an abnormal-part probability map may be displayed in a superimposed manner, or the finding detection position may be enlarged and displayed.
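  • A minimal matplotlib sketch of such a superimposed display is shown below; the B-scan and the abnormal-part probability map are random placeholders, and the colormap and transparency are arbitrary choices.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
b_scan = rng.random((512, 256))                 # grayscale tomographic image
prob_map = rng.random((512, 256))               # per-pixel abnormality probability

fig, ax = plt.subplots()
ax.imshow(b_scan, cmap="gray")
overlay = ax.imshow(prob_map, cmap="jet", alpha=0.35)   # semi-transparent heat map
fig.colorbar(overlay, ax=ax, label="abnormality probability")
ax.set_axis_off()
plt.show()
```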
  • the control unit 11 may display the inspection data side by side in the order of importance, or may display a plurality of inspection data sequentially according to the passage of time or user operation. The control unit 11 may customize the diagnostic screen 101 as appropriate based on the user's operation.
  • The control unit 11 may switch the display of the diagnostic screen each time one of the buttons in the button group 120 is pressed. For example, when the user presses the button 122 for disease B, the control unit 11 receives the selection signal for disease B and controls the display of the display unit 15 so that, for example, the test data necessary for diagnosing disease B is displayed.
  • FIG. 5 shows the diagnostic screen 102 displayed when the button 122, to which disease B (for example, a disease in which abnormalities of the fundus blood vessels appear, such as diabetic retinopathy) is assigned, is pressed.
  • the control unit 11 displays the motion contrast image (OCT angiography) 141 calculated based on the OCT signal, the tomographic image 142, and the transition 143 of similar case data on the diagnosis screen 102, for example.
  • the control unit 11 may change the type of examination data displayed on the display unit 15 according to the disease of interest.
  • For example, an image 144 taken 6 months after and an image 145 taken 12 months after a first treatment method was applied to the eye of a similar case are displayed. Likewise, an image 146 taken 6 months after and an image 147 taken 12 months after a second treatment method was applied to the eye of a similar case are displayed.
  • When displaying the transition of similar case data, the control unit 11 calculates, for example, the degree of similarity between the examination data of the eye to be inspected and the past case data accumulated in the data server 30. Then, the control unit 11 selects similar case data satisfying a specific condition and displays its transition on the diagnostic screen 102.
  • the specific condition may be set by, for example, the degree of similarity, the newness of the case date, or any conditional expression, or may be set by the user arbitrarily. In each image of similar cases, the degree of similarity, the similar area, the degree of difference, or the different area may be displayed.
  • the method of calculating the degree of similarity may be machine learning, clustering, pixel vectors, various statistical methods, and other mathematical correlation calculation methods. Further, in calculating the degree of similarity, the feature amount output in the automatic analysis may be used.
  • The control unit 11 may display a prompt to capture the inspection data, a method of capturing the inspection data, and the like. A button for communicating with the imaging device may also be displayed so that the image can be captured directly.
  • the user checks the relevant data displayed on each diagnosis screen for each disease included in the disease information, and makes the final diagnosis.
  • the user may input the final diagnosis result into the diagnosis support device 10.
  • the diagnosis result input to the diagnosis support device 10 may be used for learning a mathematical model.
  • the diagnosis support device 10 of the present embodiment selects and presents information necessary for the user to make a definitive diagnosis based on the disease information of the eye to be examined.
  • The user can thus efficiently perform definitive diagnosis and treatment planning. For example, compared with referring to all of a large number of test data before making a diagnosis, giving priority to the test data that is likely to contribute to the definitive diagnosis makes it possible to reach a definitive diagnosis with fewer data references and less reference time.
  • the diagnosis support device 10 can save the user the trouble of referring to the inspection data that does not contribute to the definitive diagnosis.
  • the diagnosis support apparatus 10 can reduce the oversight of findings by the user by displaying the inspection data related to the definitive diagnosis in a list.
  • The diagnosis support device 10 can display the transition of similar case data so that the user can refer to typical transitions of similar past cases (including recent cases) when making a treatment plan. As a result, the diagnosis support device 10 can supplement the user's knowledge and support a more appropriate treatment plan.
  • In the above description, the result display screen 100 transitions to the diagnostic screens 101 and 102 when the user selects a disease or the like; however, the screen may automatically transition to the diagnostic screens 101 and 102 based on a specific condition, and a list of the inspection data may be displayed.
  • the specific condition may be set, for example, according to the magnitude of the probability that a disease or the like exists, or may be set arbitrarily by the user.
  • control unit 11 may output the position of each disease (specifically, the position where it is determined that the disease may exist) to the result display screen 100 or the like in the automatic analysis.
  • Instead of switching the display to the diagnostic screens 101 and 102 by pressing a button in the button group 120, the display may be switched by selecting the position of the disease.
  • When the disease information is acquired from the examination data of one of the left and right eyes of the same subject, the control unit 11 may display the examination data of the other eye of that subject on the diagnostic screen 101.
  • The user can thereby diagnose the other eye while taking the diagnosis result of the first eye into account. For example, when glaucoma or the like develops in one eye, it tends to develop in the other eye as well. Therefore, by also outputting the inspection data of the other eye (for example, the eye with the milder condition) based on the disease information of one eye, the diagnosis can be performed more appropriately.
  • the inspection data output method is not limited to the display on the display unit 15.
  • the inspection data may be transmitted to another device or may be output as a report.
  • the method of outputting the report may be a method of printing on paper or a method of outputting data in a specific format (for example, PDF data or the like).
  • the ophthalmologic apparatus 20 may be a composite apparatus capable of acquiring examination data of a plurality of different modalities.
  • the control unit 11 may acquire examination data of multiple types of modalities from the ophthalmologic apparatus 20.
  • The control unit 11 may perform automatic analysis using at least some of the types of inspection data acquired from the ophthalmologic apparatus 20, and may display related data including the other types of inspection data on the diagnostic screen based on the automatic analysis result.
  • 11 control unit, 12 CPU, 13 storage unit, 14 communication unit, 20 ophthalmologic apparatus, 30 data server, 40 terminal device, 50 ophthalmologic apparatus, 60 network

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Eye Examination Apparatus (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention addresses the technical problem of providing a diagnosis support device that facilitates making a definitive diagnosis based on disease information, and a diagnosis support program. This diagnosis support device is for supporting the diagnosis of a subject's eye; a control means of the diagnosis support device is characterized by acquiring disease information based on at least one piece of examination data concerning the subject's eye and by causing an output means to output information corresponding to a disease of interest selected using the disease information. With this configuration, a definitive diagnosis based on disease information is facilitated.
PCT/JP2019/046856 2018-12-04 2019-11-29 Diagnosis support device and diagnosis support program WO2020116351A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020559148A JPWO2020116351A1 (ja) 2018-12-04 2019-11-29 診断支援装置、および診断支援プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-227719 2018-12-04
JP2018227719 2018-12-04

Publications (1)

Publication Number Publication Date
WO2020116351A1 true WO2020116351A1 (fr) 2020-06-11

Family

ID=70975122

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/046856 WO2020116351A1 (fr) 2018-12-04 2019-11-29 Dispositif d'aide au diagnostic et programme d'aide au diagnostic

Country Status (2)

Country Link
JP (1) JPWO2020116351A1 (fr)
WO (1) WO2020116351A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014110883A (ja) * 2012-10-30 2014-06-19 Canon Inc 画像処理装置及び画像処理方法
JP2015203920A (ja) * 2014-04-11 2015-11-16 キヤノン株式会社 類似症例検索システム、類似症例検索方法及びプログラム
JP2018121886A (ja) * 2017-01-31 2018-08-09 株式会社ニデック 画像処理装置、および画像処理プログラム

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114255869A (zh) * 2022-01-26 2022-03-29 广州天鹏计算机科技有限公司 一种医疗大数据云平台
CN114255869B (zh) * 2022-01-26 2022-10-28 深圳市拓普智造科技有限公司 一种医疗大数据云平台
WO2024070907A1 (fr) * 2022-09-30 2024-04-04 株式会社ニデック Dispositif de traitement d'image de fond d'œil et programme de traitement d'image de fond d'œil

Also Published As

Publication number Publication date
JPWO2020116351A1 (ja) 2021-10-28

Similar Documents

Publication Publication Date Title
US11633096B2 (en) Ophthalmologic image processing device and non-transitory computer-readable storage medium storing computer-readable instructions
JP6907563B2 (ja) 画像処理装置、および画像処理プログラム
WO2018143180A1 (fr) Dispositif de traitement d'image et programme de traitement d'image
JP6878923B2 (ja) 画像処理装置、画像処理システム、および画像処理プログラム
US20180360304A1 (en) Ophthalmologic information processing device and non-transitory computer-readable storage medium storing computer-readable instructions
WO2020116351A1 (fr) Dispositif d'aide au diagnostic et programme d'aide au diagnostic
US20220284577A1 (en) Fundus image processing device and non-transitory computer-readable storage medium storing computer-readable instructions
WO2020026535A1 (fr) Dispositif de traitement d'images ophtalmiques, dispositif oct et programme de traitement d'images ophtalmiques
JP7196908B2 (ja) 眼科画像処理装置および眼科画像処理プログラム
JP2024045441A (ja) 眼科画像処理装置、および眼科画像処理プログラム
JP2021101965A (ja) 制御装置、光干渉断層撮影装置、光干渉断層撮影装置の制御方法、及びプログラム
JP2019208852A (ja) 眼科画像処理装置、および眼科画像処理プログラム
JP2019208851A (ja) 眼底画像処理装置および眼底画像処理プログラム
JP7439419B2 (ja) 眼科画像処理プログラムおよび眼科画像処理装置
JP2022082077A (ja) 眼科画像処理装置、および、眼科画像処理プログラム
JP2020058615A (ja) 画像処理装置、学習済モデル、画像処理方法およびプログラム
JP2022138552A (ja) 眼科画像処理装置、眼科画像処理プログラム、および眼科画像撮影装置
JP2021074095A (ja) 眼科画像処理装置および眼科画像処理プログラム
JP7302184B2 (ja) 眼科画像処理装置、および眼科画像処理プログラム
JP2020036837A (ja) 眼科画像処理装置、および眼科撮影装置
JP7468163B2 (ja) 眼科画像処理プログラムおよび眼科画像処理装置
JP6568375B2 (ja) 眼科情報処理システム、画像処理装置、および画像処理方法
WO2020241794A1 (fr) Dispositif de traitement d'image ophtalmique, programme de traitement d'image ophtalmique et système de traitement d'image ophtalmique
JP7210927B2 (ja) 眼科画像処理装置、oct装置、および眼科画像処理プログラム
JP7180187B2 (ja) 眼科画像処理装置、oct装置、および眼科画像処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19892237

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020559148

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19892237

Country of ref document: EP

Kind code of ref document: A1