WO2021020187A1 - Medical system, medical device and medical program - Google Patents


Info

Publication number
WO2021020187A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
evaluation
input information
unit
Application number
PCT/JP2020/027992
Other languages
French (fr)
Japanese (ja)
Inventor
知明 久慈
高呂 梅田
純平 竹下
Original Assignee
知明 久慈
高呂 梅田
純平 竹下
Application filed by 知明 久慈, 高呂 梅田, 純平 竹下
Priority to JP2021536951A (JPWO2021020187A1/ja)
Publication of WO2021020187A1 (WO2021020187A1/en)

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/24: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • The present invention relates to medical systems, medical devices, and medical programs that assist in the diagnosis of a human or animal subject.
  • Conventionally, the techniques disclosed in Patent Documents 1 and 2 have been proposed for observing patients using images.
  • The oral cavity inserter of Patent Document 1 is attached to a monitor device provided with a long scope portion and is inserted into the oral cavity. It has a long cover member with a hole through which the scope portion is inserted, a long flat-plate-shaped tongue depressor portion provided on a side portion of the cover member, and fixing means for fixing the cover member to the monitor device, and the width of at least the tip portion of the tongue depressor portion is larger than the width of the cover member.
  • The intraoral imaging system of Patent Document 2 includes a manually positionable intraoral image capture device, a display, a processor, and a non-transitory memory. The non-transitory memory stores instructions that, when executed by the processor, cause the system to receive live image data of a patient from the intraoral image capture device, display the live image data on the display, access a previously stored intraoral image of the patient from the non-transitory memory, generate an alignment mask based on the previously stored intraoral image, superimpose the alignment mask on the live image data on the display, capture a new intraoral image of the patient from the live image data, and store the new intraoral image in the non-transitory memory.
  • However, Patent Document 1 merely allows the oral cavity and pharyngeal cavity to be observed through the captured image, and cannot quantitatively evaluate a disease based on the image.
  • Patent Document 2 merely assists a dental specialist in capturing a series of dental images with appropriate and consistent alignment using an imaging device, and likewise cannot quantitatively evaluate a disease based on the images.
  • Accordingly, an object of the present invention is to provide a medical system, a medical device, and a medical program capable of improving the accuracy of disease evaluation based on images.
  • The medical system according to the present invention includes: irradiation means for irradiating an object, which is a human or an animal, with irradiation light; first imaging means for capturing a first image of the object irradiated with the irradiation light; acquisition means for acquiring input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; evaluation means for referring to the reference database and generating evaluation information including a first degree of association, of three or more levels, between the input information acquired by the acquisition means and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result. The medical system is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • The medical system according to the present invention includes: first imaging means for capturing a two-dimensional first image of an object, which is a human or an animal; second imaging means for capturing a three-dimensional second image of the object; acquisition means for acquiring input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; evaluation means for referring to the reference database and generating evaluation information including a first degree of association, of three or more levels, between the input information acquired by the acquisition means and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result. The medical system is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • The medical device according to the present invention includes: an irradiation unit that irradiates an object, which is a human or an animal, with irradiation light; a first imaging unit that captures a first image of the object irradiated with the irradiation light; an acquisition unit that acquires input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association between the input information acquired by the acquisition unit and the reference information; and an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result. The medical device is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • The medical device according to the present invention includes: a first imaging unit that captures a two-dimensional first image of an object, which is a human or an animal; a second imaging unit that captures a three-dimensional second image of the object; an acquisition unit that acquires input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association, of three or more levels, between the input information acquired by the acquisition unit and the reference information; and an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result. The medical device is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • The medical program according to the present invention causes a computer to execute: an irradiation step of irradiating an object, which is a human or an animal, with irradiation light; a first imaging step of capturing a first image of the object irradiated with the irradiation light; an acquisition step of acquiring input information including the first image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result. The medical program is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • The medical program according to the present invention causes a computer to execute: a first imaging step of capturing a two-dimensional first image of an object, which is a human or an animal; a second imaging step of capturing a three-dimensional second image of the object; an acquisition step of acquiring input information including the first image and the second image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association, of three or more levels, between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result. The medical program is characterized in that the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to treatment.
  • FIG. 1 is a schematic view showing a configuration example of the treatment support system according to the first embodiment.
  • FIGS. 2(a) and 2(b) are schematic views showing an example of the evaluation result in the first embodiment.
  • FIG. 3 is a schematic diagram showing an example of a first image including incidental information.
  • FIG. 4(a) is a schematic diagram showing an example of the configuration of the treatment support device in the first embodiment, and FIG. 4(b) is a schematic diagram showing an example of the functions of the treatment support device in the first embodiment.
  • FIG. 5 is a schematic diagram showing an example of a reference database according to the first embodiment.
  • FIG. 6 is a schematic diagram showing a first modification of the reference database according to the first embodiment.
  • FIG. 7 is a schematic diagram showing a second modification of the reference database according to the first embodiment.
  • FIG. 8 is a schematic view showing an example of the configuration of the medical device according to the first embodiment.
  • FIG. 9 is a flowchart showing an example of the operation of the medical system according to the first embodiment.
  • FIG. 10 is a schematic diagram showing a medical system according to the second embodiment.
  • FIG. 11 is a schematic diagram showing a medical system according to the third embodiment.
  • FIG. 12 is a schematic diagram showing a medical system according to the fourth embodiment.
  • FIG. 13 is a schematic diagram showing a medical system according to the fifth embodiment.
  • FIG. 14 is a schematic diagram showing a medical system according to the sixth embodiment.
  • FIG. 1 is a block diagram showing an overall configuration of the medical system 100 according to the first embodiment.
  • the medical system 100 includes an evaluation device 1 and a medical device 2.
  • the medical device 2 may be connected to the evaluation device 1 via, for example, a public communication network 4 (network), or may be connected to, for example, a server 3.
  • the medical device 2 may be connected to the evaluation device 1 by wire.
  • the medical system 100 is used when diagnosing a human or animal subject.
  • The evaluation device 1 receives input information including an image of the target and outputs an evaluation result based on that input information. As shown in FIG. 2(a), for example, the evaluation device 1 outputs an evaluation result based on input information including a first image of the object irradiated with the irradiation light. As shown in FIG. 2(b), for example, the evaluation device 1 outputs an evaluation result based on input information including a two-dimensional first image and a three-dimensional second image of the target.
  • The evaluation result includes disease information on a target disease such as influenza. The evaluation result can, for example, assist the user in diagnosing a human or animal disease.
  • the evaluation device 1 refers to the reference database when outputting the evaluation result.
  • the reference database stores the past input information acquired in advance, the reference information used for evaluating the past input information, and the degree of association between the past input information and the reference information.
  • the evaluation device 1 refers to the reference database, calculates the first association between the input information and the reference information, and generates an evaluation result based on the evaluation information including the first association.
  • the input information includes the first image which is the target two-dimensional image.
  • the input information may further include a second image, imaging condition information, medical record data, measurement data, and audio data.
  • the input information includes inspection information including the first image.
  • the inspection information may further include a second image, measurement data, and audio data.
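As a rough sketch, the input information described in the preceding bullets could be modeled as a single container object. The class and field names below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InputInformation:
    """Hypothetical container mirroring the input information described above."""
    first_image: bytes                                       # two-dimensional first image (required)
    second_image: Optional[bytes] = None                     # optional three-dimensional second image
    imaging_conditions: dict = field(default_factory=dict)   # e.g. distance to target, illuminance
    medical_record: dict = field(default_factory=dict)       # e.g. blood pressure, body temperature
    measurement_data: dict = field(default_factory=dict)     # e.g. ELISA measurements
    audio_data: Optional[bytes] = None

# Minimal usage: only the first image is mandatory; everything else is optional.
info = InputInformation(first_image=b"raw-image-bytes",
                        medical_record={"body_temperature_c": 38.2})
```

Keeping the optional elements as independent fields matches the point made later that individual elements of the input information can be stored and evaluated independently.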
  • the first image is captured by the first imaging unit 22.
  • the first image may be a still image or a moving image.
  • the first image may include audio data.
  • the first image may be a two-dimensional image of the object irradiated by the irradiation unit 21.
  • the first image may be a two-dimensional image of an object that has not been irradiated by the irradiation unit 21.
  • the first image may include incidental information regarding the abnormal portion.
  • The incidental information includes site information indicating the abnormal site, discoloration information on the discoloration state of the abnormal site, and shape information on the shape of the abnormal site.
  • the first image including the incidental information is generated by inputting the incidental information into the first image captured by the first imaging unit 22, for example, by the input unit 15.
  • The incidental information may be displayed on the first image.
  • the discoloration information and the shape information may include, for example, a severity score based on the guidelines of the Japanese Society of Oral and Pharyngeal Science.
  • the second image is captured by the second imaging unit 23.
  • the second image is a three-dimensional image of the object in the region corresponding to the first image.
  • the second image may be a still image or a moving image.
  • the second image may include audio data.
  • the second image may include incidental information.
  • the imaging condition information includes information on the imaging conditions of the first image and information on the imaging conditions of the second image.
  • the imaging condition information includes information on the distance to the target, the illuminance of the irradiation unit 21, the imaging range, and the like.
  • The medical record data is data taken from the medical record for the disease.
  • the medical record data includes, for example, blood pressure, body temperature, physical findings, laboratory findings, and the like.
  • the measurement data is measured by the measuring unit 24.
  • The measurement data may include data measured using a biochemical reaction, such as the ELISA (Enzyme-Linked ImmunoSorbent Assay) method or an enzyme method.
  • The measurement data may also include data measured using an electrochemical method.
  • the measurement data may include measurement data measured by spectroscopic techniques.
  • the measurement data may include infrared spectrum data.
  • the measurement data may include measurement data measured using the natural frequency.
  • the measurement data may include measurement data measured using electrophoresis.
  • the measurement data may include temperature data of interest.
  • the reference information includes at least one of disease information, prescription drug information, biological information and treatment information related to the target disease.
  • the reference information may further include advertising information.
  • Disease information includes, for example, information on influenza.
  • Disease information includes information on diseases in the oral cavity such as inflammation of the oral mucosa, erosion, ulcer, tumor, tonsillar hypertrophy, white coating, Koplik spots, pharyngitis, jaw bone necrosis, and oral mucositis due to chemotherapy.
  • Disease information includes, for example, information about anemia.
  • Disease information includes, for example, information about body temperature.
  • Disease information includes, for example, information on respiratory arrest, cardiac arrest, deep coma, pupil diameter, pupil fixation, brainstem reflex, light reflex, spontaneous reflex, and the like.
  • the disease information includes information on eye diseases such as inflammation of the eyeball and conjunctiva, erosion, ulcer, tumor, hyperemia, bleeding, edema and jaundice of the eyeball, eye protrusion, pupil diameter, and the like.
  • Disease information includes information on nasal diseases such as inflammation in the nasal cavity, erosions, ulcers, tumors, wounds, bleeding, etc.
  • Disease information includes information on ear diseases such as inflammation of the ear and eardrum, erosion, ulcer, tumor, wound, etc.
  • Disease information includes information on cervical diseases such as swelling of the cervical and supraclavicular lymph nodes, cervical aneurysms, and jugular vein distension.
  • Disease information includes information on lip diseases such as lip inflammation, erosions, ulcers, tumors, wounds, burns, cyanosis, etc.
  • Disease information includes information about Parkinson's disease.
  • Disease information includes information about Cushing's syndrome.
  • Disease information includes information on breast and nipple diseases such as inflammation of the breast and nipple, erosion, ulcer, tumor, and wound.
  • Disease information includes information on diseases of the chest and axillary lymph nodes such as inflammation of the skin and axillary lymph nodes of the chest, erosions, ulcers, tumors, wounds and respiratory muscle fatigue.
  • Disease information includes information on skin diseases such as skin inflammation, erosions, ulcers, tumors, masses, wounds, sunburns, burns, frostbite, necrosis, bleeding, stains, rashes, hair loss, and tones.
  • the disease information includes information on diseases of the abdomen and inguinal lymph nodes such as abdominal skin, inflammation of inguinal lymph nodes, erosions, tumors, and wounds.
  • Disease information includes information on genitourinary diseases such as inflammation of the genitourinary system (pudendal region, urinary meatus, vagina, etc.), erosion, ulcer, tumor, wound, internal hemorrhoid, external hemorrhoid, anal fistula, etc.
  • Disease information includes information on lower limb diseases such as lower limb edema, varicose veins, necrosis, lymphedema and the like.
  • the disease information includes information on diseases of the upper limbs and fingers such as edema of the upper limbs, varicose veins, necrosis, lymphedema, and palmar erythema.
  • The disease information includes information on diseases of the nails and fingers, such as changes in nail color tone and clubbed fingers.
  • Prescription drug information is information on prescription drugs that should be prescribed to the subject.
  • The prescription drug information proposes, for example, a therapeutic drug for the target disease; specifically, it is information such as "Tamiflu corresponds to a therapeutic drug for influenza". This information can be conveyed not only by displaying text on the output portion 109, such as a display, but also by voice.
  • Biological information includes information on human or animal genes, information on body temperature, and information on heart rate.
  • Treatment information is information about treatment.
  • the treatment information is, for example, information on treatment such as gargling, taking vitamins, and securing sleep time.
  • Advertising information is information related to advertising for a given product or service.
  • the advertising information is, for example, information related to advertisements for throat lozenges, gummies, supplements, mouthwashes, and the like.
  • FIG. 4A is a schematic view showing an example of the configuration of the evaluation device 1.
  • As the evaluation device 1, an electronic device such as a smartphone or a personal computer (PC) is used.
  • the evaluation device 1 may be integrally formed with, for example, any device capable of acquiring input information.
  • The evaluation device 1 includes a housing 10, a CPU 101, a ROM 102, a RAM 103, a storage unit 104, and I/Fs 105 to 107. The components 101 to 107 are connected by an internal bus 110.
  • the CPU (Central Processing Unit) 101 controls the entire evaluation device 1.
  • the ROM (Read Only Memory) 102 stores the operation code of the CPU 101.
  • the RAM (Random Access Memory) 103 is a work area used during the operation of the CPU 101.
  • the storage unit 104 stores various information such as images.
  • As the storage unit 104, for example, a data storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive) is used.
  • The evaluation device 1 may have a GPU (Graphics Processing Unit) (not shown). Having a GPU enables faster arithmetic processing.
  • the I / F 105 is an interface for transmitting and receiving various information to and from the server 3 and the like via the public communication network 4.
  • the I / F 106 is an interface for transmitting / receiving information to / from the input portion 108.
  • a keyboard is used as the input portion 108, and a user or the like of the medical system 100 inputs various information or a control command of the evaluation device 1 via the input portion 108.
  • the I / F 107 is an interface for transmitting and receiving various information to and from the output portion 109.
  • the output unit 109 outputs various information stored in the storage unit 104, the processing status of the evaluation device 1, and the like.
  • a display is used as the output portion 109, and for example, a touch panel type may be used.
  • FIG. 4B is a schematic diagram showing an example of the function of the evaluation device 1.
  • the evaluation device 1 includes an information DB 11, an acquisition unit 12, an evaluation unit 13, an output unit 14, and an input unit 15.
  • the evaluation device 1 may include, for example, an update unit 16.
  • The functions shown in FIG. 4(b) are realized by the CPU 101 executing a program stored in the storage unit 104 or the like, using the RAM 103 as a work area. Each of the components 11 to 16 may also be controlled by, for example, artificial intelligence.
  • the information DB 11 stores a reference database in which the past input information acquired in advance and the reference information used for the past input information are stored.
  • The information DB 11 also stores various information such as the input information, evaluation information including the first degree of association between the input information and the reference information, the evaluation result generated based on the evaluation information, and a format for displaying the evaluation result.
  • the reference database and various information are stored as a database of various information in the storage unit 104 embodied in HDD, SSD, or the like.
  • Each configuration 12 to 16 stores various information in the information DB 11 or retrieves various information as needed.
  • the reference database stores three or more levels of association between the past input information and the reference information.
  • the reference database is formed by, for example, an algorithm that can calculate the degree of association.
  • The past input information and the reference information each comprise a plurality of data items, and each past data item is linked to each reference data item by a degree of association.
  • For example, the first image A included in the past input information has a degree of association of "80%" with disease A included in the reference information, and a degree of association of "15%" with disease B. That is, the "degree of association" indicates the strength of the connection between data items: the higher the degree of association, the stronger the connection.
  • The past input information and the reference information may be stored in the reference database in the form of images or video, or in the form of, for example, numerical values, matrices (vectors), or histograms.
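The association structure described above can be sketched as a simple lookup table. The keys, disease names, and percentages below are illustrative, not taken from the disclosure:

```python
# Hypothetical reference database: each item of past input information maps to
# reference information with a stored degree of association (a percentage, so
# three or more levels are naturally possible).
reference_db = {
    "first_image_A": {"disease_A": 80, "disease_B": 15, "disease_C": 5},
    "first_image_B": {"disease_A": 10, "disease_B": 75, "disease_C": 15},
}

def degrees_of_association(past_input_key):
    """Return (reference, degree) pairs for one past input item,
    strongest connection first."""
    return sorted(reference_db[past_input_key].items(),
                  key=lambda kv: kv[1], reverse=True)

print(degrees_of_association("first_image_A"))
# -> [('disease_A', 80), ('disease_B', 15), ('disease_C', 5)]
```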
  • the degree of association is calculated using, for example, machine learning.
  • As the machine learning, for example, deep learning is used.
  • the degree of association is calculated by machine learning using the past input information and the reference information as a data set, and the reference database is constructed.
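The disclosure names deep learning only as one example and leaves the learning method open. As a deliberately simplified stand-in for a learned matcher, a nearest-neighbour lookup over hypothetical feature vectors illustrates how a new input might be matched to past input information:

```python
def nearest_past_input(input_vec, past_inputs):
    """Return the key of the past input whose feature vector is closest
    (Euclidean distance) to the new input vector -- a simplified stand-in
    for the learned matching described in the text."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(past_inputs, key=lambda k: dist(input_vec, past_inputs[k]))

past_inputs = {
    "first_image_A": [0.9, 0.1],   # hypothetical feature vector of a past first image
    "first_image_B": [0.2, 0.8],
}
print(nearest_past_input([0.85, 0.2], past_inputs))  # -> first_image_A
```

In a real system the feature vectors would come from the trained model itself; here they are fixed by hand purely for illustration.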
  • the past input information is, for example, a combination of at least one of the past first image and the past second image, the past imaging condition information, the past chart data, and the past measurement data.
  • The degree of association is calculated based on the relationship with the reference information. For example, the combination of first image A included in the past first images and second image A included in the past second images has a degree of association of "80%" with disease A and of "15%" with disease B.
  • In this case, the past first image data and the past second image data can be stored independently. Therefore, when generating an evaluation result based on the input information described later, accuracy can be improved and the range of options expanded.
  • the past input information is a combination of at least one of the past first image and the past second image, the past imaging condition information, the past chart data, and the past measurement data.
  • the degree of association is calculated based on the relationship with the reference information.
  • In this modification, the past measurement data is omitted from the past input information.
  • the reference information includes disease information, prescription drug information, treatment information and advertising information.
  • For example, the combination of first image B included in the past first images, chart data A included in the past chart data, and imaging condition A included in the past imaging condition information has a degree of association of "50%" with disease B, of "80%" with treatment A, and of "20%" with advertisement B.
  • In this case, each element of the past input information can be stored independently. Therefore, when generating an evaluation result based on the input information described later, accuracy can be improved and the range of options expanded.
  • the acquisition unit 12 generates input information to be evaluated and acquires the input information.
  • the acquisition unit 12 may acquire the input information generated by the medical device 2.
  • The input information may be acquired from the medical device 2 via the public communication network 4 or via a storage medium such as a portable memory.
  • the format of the input information is arbitrary, and for example, the acquisition unit 12 may convert it to an arbitrary file format.
  • the acquisition unit 12 may acquire the examination information from the medical device 2, generate the input information including the acquired examination information, and acquire the generated input information.
  • the evaluation unit 13 acquires evaluation information including a first degree of association of three or more levels between the input information and the reference information.
  • The evaluation unit 13 refers to the reference database, selects past input information that matches or is similar to the input information, and calculates the degree of association linked to the selected past input information as the first degree of association.
  • The evaluation unit 13 may also use, for example, the reference database as a classifier algorithm to calculate the first degree of association between the input information and the reference information.
  • the degree of association between the two is calculated as the first degree of association.
  • A value obtained by multiplying the degree of association between the combination of the first image A and the second image B and the reference information by an arbitrary coefficient may be calculated as the first degree of association.
  • After calculating the first degree of association, the evaluation unit 13 acquires evaluation information including the input information, the reference information, and the first degree of association.
  • the evaluation unit 13 may calculate the first degree of association by referring to the reference database shown in FIG. 5, for example.
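The lookup described above could be sketched as follows. This is a minimal illustration, not the patent's implementation: the record layout, the names (`REFERENCE_DB`, `first_degree_of_association`), and the field-matching similarity measure are all assumptions made for the example.

```python
# Illustrative sketch: select the past input information most similar to the
# new input information and return its stored degrees of association as the
# first degree of association. All names and the similarity measure are
# assumptions for illustration only.

def similarity(a: dict, b: dict) -> float:
    """Fraction of fields (e.g. image ID, imaging condition) that match."""
    keys = set(a) | set(b)
    if not keys:
        return 0.0
    return sum(1 for k in keys if a.get(k) == b.get(k)) / len(keys)

def first_degree_of_association(input_info: dict, reference_db: list) -> dict:
    """Return the reference-information degrees stored for the closest past input."""
    best = max(reference_db, key=lambda rec: similarity(input_info, rec["past_input"]))
    return best["degrees"]

# Example reference database mirroring the combination described above.
REFERENCE_DB = [
    {"past_input": {"image": "first image B", "chart": "chart data A",
                    "condition": "imaging condition A"},
     "degrees": {"disease B": 0.50, "treatment A": 0.80, "advertisement B": 0.20}},
    {"past_input": {"image": "first image A", "chart": "chart data B",
                    "condition": "imaging condition B"},
     "degrees": {"disease A": 0.90}},
]

degrees = first_degree_of_association(
    {"image": "first image B", "chart": "chart data A",
     "condition": "imaging condition A"}, REFERENCE_DB)
```

An exact match returns the stored degrees directly; a classifier trained on the same records, as the text also mentions, would generalize to inputs with no exact match.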
  • the output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result.
  • The output unit 14 generates an evaluation result for the input information, for example, based on the first degree of association in the evaluation information. Further, the output unit 14 may output the evaluation information as the evaluation result without processing it.
  • the evaluation result may include input information, reference information, and first degree of association.
  • the output unit 14 outputs the generated evaluation result.
  • the output unit 14 may output the evaluation result to the output unit 109 via the I / F 107, or may output the evaluation result to any device via the I / F 105, for example.
  • The output unit 14 may convert the voice data collected by the sound collecting unit 25 into text and output the transcribed voice data together with the evaluation result.
  • The site of the lesion may be clearly indicated in the first image included in the evaluation result. As a result, doctors and others can be saved the trouble of writing medical records.
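How the output unit 14 might render evaluation information into a text-format result, optionally appending transcribed voice data, can be sketched as follows. The field names and report layout are illustrative assumptions, not the patent's format.

```python
# Minimal sketch: render evaluation information (input information plus the
# first degrees of association with each piece of reference information) as a
# text-format evaluation result. Field names and layout are assumptions.

def format_evaluation_result(evaluation: dict, transcript: str = "") -> str:
    lines = [f"Input: {evaluation['input']}"]
    # List each piece of reference information, highest degree first.
    for ref, degree in sorted(evaluation["degrees"].items(),
                              key=lambda kv: kv[1], reverse=True):
        lines.append(f"  {ref}: {degree:.0%}")
    if transcript:  # e.g. the doctor's transcribed remark from the sound collecting unit
        lines.append(f"Voice note: {transcript}")
    return "\n".join(lines)

report = format_evaluation_result(
    {"input": "first image B", "degrees": {"disease B": 0.5, "treatment A": 0.8}},
    transcript="It's influenza")
```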
  • the input unit 15 receives the input information transmitted from the medical device 2 via the I / F 105, and also receives various information input from the input portion 108 via the I / F 106, for example.
  • the input unit 15 may receive, for example, the input information stored in the server 3.
  • the input unit 15 may receive input information or the like via a storage medium such as a portable memory.
  • the input unit 15 receives, for example, update data created by the user based on the evaluation result, learning data used for updating the degree of association, and the like.
  • <Update unit 16> For example, when the update unit 16 newly acquires the relationship between the past input information and the reference information, the update unit 16 reflects the relationship in the degree of association.
  • As the data to be reflected in the degree of association, for example, update data including input information newly created by the user and reference information corresponding to that input information is used.
  • learning data created by the user based on the evaluation result is used.
  • For example, the sound collecting unit 25 collects, as voice data, the voice of a doctor who muttered "It's influenza" when examining a patient suspected of having influenza.
  • The sound collecting unit 25 converts the collected voice data into text, performs natural language processing, extracts the disease information uttered by the doctor, and generates a label as reference information according to the reference data set; this constitutes the processed voice data. The combination of the input information and the label is recorded in the information DB 11 or the like by, for example, the input unit 15.
  • the update unit 16 includes a mechanism that assists in inserting the input information and the label as a new record in the data set.
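The voice-to-label step above could be sketched as follows. A simple keyword match stands in for the natural language processing the text describes, and the disease vocabulary is an illustrative assumption.

```python
# Sketch: turn transcribed voice data into a reference-information label.
# Keyword matching is a deliberately simple stand-in for natural language
# processing; the vocabulary below is an assumption for illustration.
from typing import Optional

DISEASE_VOCABULARY = {"influenza", "pharyngitis", "tonsillitis"}

def extract_disease_label(transcript: str) -> Optional[str]:
    """Return the first known disease name found in the transcribed remark."""
    for word in transcript.lower().replace("'", " ").split():
        token = word.strip(".,!?")
        if token in DISEASE_VOCABULARY:
            return token
    return None

label = extract_disease_label("It's influenza")
```

The extracted label and the corresponding input information would then be inserted as a new record in the data set, as the update unit 16 described above assists in doing.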
  • the medical device 2 includes an inspection information generation unit 20 and a holding unit 26.
  • the medical device 2 can send and receive various information to and from the evaluation device 1 via, for example, a public communication network 4. Further, the medical device 2 can generate input information and transmit it to the evaluation device 1.
  • the medical device 2 may include, for example, the configuration of the evaluation device 1.
  • The medical device 2 is used, for example, when examining the oral cavity of a target. The medical device 2 may also be used to examine, for example, the forehead, eyeball, tongue, ear, nasal cavity, neck, face, axilla, breast, nipple, chest, abdomen, lower limbs, upper limbs, fingers, body fluids, urinary organs, genital organs, and the entire skin.
  • the shape of the medical device 2 is formed, for example, in a pen shape.
  • The medical device 2 may also be formed in, for example, a pistol shape, a bridge shape, an endoscope shape, or a capsule shape.
  • the inspection information generation unit 20 includes an irradiation unit 21, a first imaging unit 22, a second imaging unit 23, a measuring unit 24, and a sound collecting unit 25.
  • the inspection information generation unit 20 generates inspection information and generates input information including inspection information.
  • The irradiation unit 21 irradiates light; for example, a light source such as a visible-light or infrared LED is used.
  • The first imaging unit 22 captures a two-dimensional first image of the target and transmits it to the evaluation device 1.
  • the first imaging unit 22 may capture a two-dimensional first image of the object irradiated by the irradiation unit 21.
  • the first imaging unit 22 images the lesion portion of the target.
  • the second imaging unit 23 captures a three-dimensional second image of the target in the region corresponding to the first image and transmits it to the evaluation device 1.
  • The second imaging unit 23 can acquire surface-unevenness data of the target from the three-dimensional second image.
  • the second imaging unit 23 images the lesion portion of the target.
  • The measuring unit 24 uses a predetermined sensor to measure measurement data of the target and transmits the data to the evaluation device 1.
  • the measuring unit 24 may measure the measurement data by using a biochemical reaction such as an ELISA (Enzyme-Linked ImmunoSorbent Assay) method or an enzyme method.
  • an enzyme, an antibody, or the like is immobilized on the surface of the sensor, and when it reacts with a substance contained in the sample, the amount of color development changes due to a chemical reaction between the enzyme and the color former.
  • the color development is measured by an optical sensor, an image sensor, visual inspection, or the like.
  • The measuring unit 24 may also measure the measurement data by using an electrochemical method.
  • an enzyme, an antibody, or the like is immobilized on the surface of the sensor, and when it reacts with a substance contained in the sample, electrons move on the surface of the sensor due to a chemical reaction between the enzyme and the color former.
  • a substance can be identified by measuring the movement of its electrons.
  • the measuring unit 24 may measure the measured data by, for example, a spectroscopic technique.
  • A substance can be identified by, for example, transmitting light in the infrared or near-infrared region and measuring the absorbance of the transmitted light.
  • Infrared spectrum data of, for example, the inside of the target's oral cavity may be measured.
  • the measuring unit 24 may measure the measurement data using the natural frequency.
  • The natural frequency can be measured by binding a specific substance to the sensor surface and applying vibration at a fixed period.
  • A trace amount of a molecule can be measured via this natural frequency to identify the substance.
  • the measuring unit 24 may measure the measurement data by using electrophoresis.
  • a voltage is applied between the flow paths in the sensor to perform electrophoresis.
  • molecules move in the flow path, and the measurement target can be specified by the molecular weight.
  • the measuring unit 24 may measure the target temperature data. Further, the measuring unit 24 may measure the distance to the target.
  • the holding unit 26 is for holding the target in a predetermined state.
  • a tongue depressor or a mouthpiece for holding the target oral cavity in an expanded state is used.
  • The sound collecting unit 25 collects the target's voice data, for example the target's voice at the time of an examination, medical interview, or the like.
  • the sound collecting unit 25 may be included in the first imaging unit 22.
  • the sound collecting unit 25 may convert the collected voice data into text data.
  • the sound collecting unit 25 converts the collected voice data into text, performs natural language processing, extracts disease information, and generates a label according to the reference data set.
  • <Server 3> Data (databases) related to various information are stored in the server 3.
  • Information sent via, for example, the public communication network 4 is stored in this database.
  • the server 3 stores the same information as the information DB 11, and may send and receive various information to and from the evaluation device 1 via the public communication network 4.
  • a database server on the network may be used.
  • the server 3 may be used in place of the storage unit 104 and the information DB 11 described above.
  • the public communication network 4 (network) is an Internet network or the like to which the evaluation device 1 and the like are connected via a communication circuit.
  • the public communication network 4 may be composed of a so-called optical fiber communication network. Further, the public communication network 4 is not limited to the wired communication network, and may be realized by a wireless communication network.
  • FIG. 9 is a flowchart showing an example of the operation of the medical system 100 in the present embodiment.
  • the inspection information generation unit 20 generates inspection information including the first image (inspection information generation step S110).
  • the inspection information generation step S110 includes an irradiation step S111, a first imaging step S112, a second imaging step S113, and a measurement step S114.
  • the irradiation unit 21 irradiates an object that is a human or an animal with irradiation light (irradiation step S111). At this time, the holding portion 26 holds the target oral cavity in an expanded state.
  • the irradiation unit 21 stores the illuminance to be irradiated in the information DB 11 as imaging condition information.
  • the inspection information generation unit 20 generates imaging condition information.
  • Next, the first imaging unit 22 acquires a two-dimensional first image of the oral cavity of the target irradiated by the irradiation unit 21 (first imaging step S112). At this time, the first imaging unit 22 may acquire audio data. The first imaging unit 22 stores the first image in the information DB 11. The medical device 2 generates inspection information including the first image.
  • Next, the second imaging unit 23 acquires a three-dimensional second image of the target's oral cavity (second imaging step S113). At this time, the second imaging unit 23 may acquire audio data. The second imaging unit 23 stores the second image in the information DB 11. The medical device 2 generates inspection information including the second image.
  • the second imaging step S113 may be omitted. Further, when performing the second imaging step S113, the irradiation step S111 may be omitted.
  • the measurement unit 24 acquires the target measurement data (measurement step S114).
  • the measurement unit 24 stores the measurement data in the information DB 11.
  • The medical device 2 generates inspection information including the measurement data.
  • the measurement step S114 may be omitted if necessary.
  • the order of the first imaging step S112, the second imaging step S113, and the measurement step S114 is arbitrary.
  • The medical device 2 generates input information including the inspection information (the imaging condition information, the first image, the second image, and the measurement data) and transmits the generated input information to the evaluation device 1.
  • The input information may be transmitted to the evaluation device 1 sequentially, after the completion of each of the irradiation step S111, the first imaging step S112, the second imaging step S113, and the measurement step S114.
  • The inspection information generation unit 20 may generate imaging condition information and medical record data as necessary, and generate input information including the imaging condition information and the medical record data.
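The flow of steps S111 to S114 can be sketched as follows: the irradiation settings become imaging condition information, the imaging and measurement steps add their outputs, and everything is bundled into the input information sent to the evaluation device. The function and field names are illustrative assumptions.

```python
# Sketch of inspection-information generation (steps S111-S114).
# Names and the dictionary layout are assumptions for illustration.

def generate_input_information(illuminance_lux, first_image,
                               second_image=None, measurement_data=None):
    inspection = {
        # Irradiation step S111: the irradiated illuminance is stored
        # as imaging condition information.
        "imaging_condition": {"illuminance_lux": illuminance_lux},
        # First imaging step S112: the two-dimensional first image.
        "first_image": first_image,
    }
    if second_image is not None:      # second imaging step S113 may be omitted
        inspection["second_image"] = second_image
    if measurement_data is not None:  # measurement step S114 may be omitted
        inspection["measurement_data"] = measurement_data
    return {"inspection_information": inspection}

input_info = generate_input_information(500.0, b"2d-image", second_image=b"3d-image")
```

Because steps S113 and S114 are optional, the bundled input information may contain only the first image and the imaging condition information.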
  • the acquisition unit 12 acquires input information including inspection information (acquisition step S120).
  • The acquisition unit 12 acquires the inspection information generated by the medical device 2 via the input unit 15, and may also acquire the input information via the public communication network 4 or via a storage medium such as a portable memory. Likewise, the acquisition unit 12 may acquire the inspection information, the imaging condition information, and the medical record data generated by the medical device 2 via the input unit 15, via the public communication network 4, or via a storage medium such as a portable memory.
  • the acquisition unit 12 may store the acquired input information and the like in the information DB 11.
  • the timing at which the acquisition unit 12 acquires the input information from the medical device 2 is arbitrary.
  • the imaging conditions and the like of the first image acquired by the acquisition unit 12 are arbitrary.
  • the acquisition unit 12 may acquire input information including at least one of the first image and the second image, imaging condition information, medical record data, and measurement data.
  • the acquisition unit 12 may acquire input information including, for example, a first image and a second image.
  • the acquisition unit 12 may acquire input information including, for example, a first image, imaging condition information, and a second image.
  • the acquisition unit 12 may acquire, for example, the first image, the imaging condition information, the second image, and the input information including the measurement data.
  • the evaluation unit 13 refers to the reference database and acquires the evaluation information including the first degree of association between the input information and the reference information (evaluation step S130).
  • the evaluation unit 13 acquires the input information from the acquisition unit 12 and acquires the reference database from the information DB 11.
  • the evaluation unit 13 can calculate the first degree of association between the input information and the reference information by referring to the reference database.
  • The evaluation unit 13 selects, for example, a past first image that matches, partially matches, or is similar to the input information and calculates the first degree of association based on the corresponding degree of association; alternatively, it may use the reference database as a classifier algorithm to calculate the first degree of association.
  • the evaluation unit 13 may store the calculated first degree of association and the acquired evaluation information in the information DB 11.
  • The evaluation unit 13 may refer to, for example, the reference database shown in FIG. 6 and calculate the first degree of association between the reference information and a combination of at least one of the first image and the second image, the imaging condition information, and the measurement data.
  • the evaluation unit 13 may refer to the reference database shown in FIG. 6, for example, and calculate the first degree of association between the combination of the first image and the second image and the reference information.
  • The acquisition unit 12 may refer to, for example, the reference database shown in FIG. 6 and calculate the first degree of association between the reference information and the combination of the first image and the imaging condition information.
  • The acquisition unit 12 may refer to the reference database shown in FIG. 6 and calculate the first degree of association between the reference information and the combination of the first image, the measurement data, and the imaging condition information.
  • The evaluation unit 13 may refer to, for example, the reference database shown in FIG. 7 and calculate the first degree of association between the reference information and a combination of at least one of the first image and the second image, the imaging condition information, the chart data, and the measurement data.
  • the evaluation unit 13 may refer to the reference database shown in FIG. 7, for example, and calculate the first degree of association between the combination of the first image and the second image and the reference information.
  • The acquisition unit 12 may refer to the reference database shown in FIG. 7 and calculate the first degree of association between the reference information and the combination of the first image, the chart data, and the imaging condition information.
  • the output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result (output step S140).
  • the output unit 14 may acquire evaluation information from the evaluation unit 13 or the information DB 11, and may acquire, for example, a format for displaying the evaluation result from the information DB 11.
  • the output unit 14 generates an evaluation result by referring to a predetermined format such as a text format based on the evaluation information.
  • the output unit 14 outputs the evaluation result.
  • the output unit 14 outputs the evaluation result to the output unit 109.
  • For example, the sound collecting unit 25 collects, as voice data, the voice of a doctor who muttered "It's influenza" when examining a patient suspected of having influenza.
  • the collected voice data is converted into text and processed in natural language.
  • the output unit 14 may output the voice data converted into text to the output unit 109 together with the output result.
  • the output unit 14 may output an evaluation result based on, for example, a result of comparing a preset notification reference value with the first degree of association.
  • For example, when the notification reference value is set to "90% or more", the evaluation result is output only when the first degree of association is 90% or more. That is, the notification reference value is an arbitrary threshold, and conditions such as above or below the threshold can be set arbitrarily.
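The notification reference value amounts to a configurable threshold test, which could be sketched as follows; the function name and parameters are illustrative assumptions.

```python
# Sketch: output the evaluation result only when the first degree of
# association satisfies the configured notification condition.
# above=True means "output at or above the threshold"; above=False inverts it,
# since the text says the condition's direction can be set arbitrarily.

def should_output(first_degree: float, reference_value: float = 0.90,
                  above: bool = True) -> bool:
    if above:
        return first_degree >= reference_value
    return first_degree < reference_value
```

With the "90% or more" setting from the example, a first degree of association of 95% triggers output and 85% does not; users then only need to check results when notified.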
  • After that, when the update unit 16 newly acquires the relationship between the past input information and the reference information, the relationship may be reflected in the degree of association (update step S150). For example, the update unit 16 acquires update data newly created by the user and reflects it in the degree of association. The update unit 16 also acquires, for example, learning data created by the user based on the evaluation result and reflects it in the degree of association. The update unit 16 calculates and updates the degree of association using machine learning; for example, a convolutional neural network is used for the machine learning.
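The text leaves the update mechanism open, naming a convolutional neural network as one machine-learning option. As a much simpler illustrative stand-in (not the patent's method), the sketch below re-estimates each degree of association as the observed co-occurrence frequency between an input and a piece of reference information in the learning data.

```python
# Sketch: frequency-based update of degrees of association from learning data.
# This counting approach is an assumption chosen for clarity; the text's own
# example of a CNN-based update is not implemented here.
from collections import Counter

def update_degrees(learning_data):
    """learning_data: list of (input_key, reference_label) pairs."""
    seen = Counter(inp for inp, _ in learning_data)   # occurrences per input
    pair = Counter(learning_data)                     # occurrences per pair
    # Degree = fraction of this input's records carrying this reference label.
    return {(inp, ref): count / seen[inp] for (inp, ref), count in pair.items()}

degrees = update_degrees([("first image A", "disease A"),
                          ("first image A", "disease A"),
                          ("first image A", "disease B"),
                          ("first image B", "disease B")])
```

As new user-created records arrive, rerunning the update shifts the stored degrees toward the newly observed relationships.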
  • With the above, the operation of the medical system 100 in the present embodiment is completed. Whether or not the above-mentioned update step S150 is performed is optional. Further, a medical program in the present embodiment may cause a computer to execute the above operation.
  • The evaluation unit 13 refers to the reference database and acquires the evaluation information including the first degree of association between the input information including the first image and the reference information. Therefore, the reference information is associated with the input information, and a quantitative evaluation result of the disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation. As a result, the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • the input information includes a first image which is a two-dimensional image and a second image which is a three-dimensional image. Therefore, as compared with the case where either one of the first image and the second image is used, the disease can be evaluated based on the input information including the first image and the second image. This makes it possible to further improve the accuracy of disease evaluation.
  • the input information further includes the imaging condition information. Therefore, in addition to the first image, the disease can be evaluated based on the input information that further considers the imaging conditions. This makes it possible to further improve the accuracy of disease evaluation. In addition, the disease can be evaluated based on the input information including the standardized image in which the imaging condition information is prepared, and the disease can be evaluated with higher accuracy.
  • the input information further includes measurement data. Therefore, in addition to the first image, the disease can be evaluated based on the input information further considering the measurement data. This makes it possible to further improve the accuracy of disease evaluation.
  • the input information further includes medical record data. Therefore, in addition to the first image, the disease or the like can be evaluated based on the input information further considering the medical record data. This makes it possible to further improve the accuracy of evaluation of diseases and the like.
  • the first image or the second image includes incidental information. Therefore, it is possible to evaluate the disease or the like with higher accuracy than using the first image or the second image without incidental information. This makes it possible to further improve the accuracy of evaluation of diseases and the like.
  • The reference information includes disease information. Therefore, the reference information is associated with the input information, and a quantitative evaluation result regarding the disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation. As a result, the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • The reference information includes prescription drug information. Therefore, the reference information is linked to the input information, and a quantitative evaluation result of the prescription drug prescribed to the subject can be obtained based on the input information. This makes it possible to improve the accuracy in evaluating the prescription drug prescribed to the subject. As a result, the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • The reference information includes the treatment information. Therefore, the reference information is associated with the input information, and a quantitative evaluation result regarding the treatment for the target can be obtained based on the input information. This makes it possible to improve the accuracy in evaluating the treatment for the target. As a result, the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • the reference information further includes advertising information. Therefore, the reference information is associated with the input information, and the quantitative evaluation result of the product or service to be advertised to the target can be obtained based on the input information. This makes it possible to propose optimal products and services to the target.
  • the holding unit 26 for holding the target in a predetermined state is provided.
  • the first image or the like can be captured while the object is held in a constant state. Therefore, it is possible to reduce the burden on the target when capturing the first image or the like.
  • An irradiation unit 21 that irradiates the target with predetermined irradiation light is provided.
  • the target can be illuminated when the first image or the like is captured. Therefore, a predetermined illuminance condition can be satisfied regardless of the surrounding environment. As a result, the disease can be evaluated with higher accuracy.
  • The irradiation unit 21 irradiates the target's oral cavity. This makes it possible to illuminate the oral cavity when the first image is captured. Therefore, the illuminance condition for the first image in the oral cavity can be satisfied, and the disease can be evaluated with higher accuracy. In particular, the oral cavity exhibits various mucosal findings and is dark inside, so providing the irradiation unit 21 makes it possible to evaluate the disease with higher accuracy.
  • the holding portion 26 is a tongue depressor or a mouthpiece for holding the target oral cavity in an expanded state. Therefore, the first image can be captured while the target's oral cavity is held in an expanded state, and the burden on the target when capturing the first image in the oral cavity can be reduced. Therefore, even a patient with a strong pharyngeal reflex or a pediatric patient can take a more reliable image.
  • the output unit 14 outputs an evaluation result based on the result of comparing the preset notification reference value with the first degree of association. Therefore, it is possible to control whether or not to output the evaluation result according to the setting condition of the notification reference value. As a result, the evaluation result can be output only when it is necessary, and the user or the like does not have to check the evaluation result at all times. Therefore, it is possible to reduce the workload of the user and the like.
  • the update unit 16 when the update unit 16 newly acquires the relationship between the past input information and the reference information, the update unit 16 reflects the relationship in the degree of association. Therefore, the degree of association can be easily updated, and the accuracy of evaluation can be improved.
  • The first imaging unit 22 can capture the first image, the second imaging unit 23 can capture the second image, and measurement data of blood or the like can be measured at the same time.
  • Deterioration of a sample such as blood for measuring the measurement data can be prevented because the sample does not have to be transported to an analysis room or the like. Therefore, the accuracy of disease evaluation can be improved, and the evaluation can be performed immediately in a medical examination room or the like.
  • the first imaging unit 22, the second imaging unit 23, the measuring unit 24, and the like are provided.
  • an evaluation result is generated based on the target two-dimensional first image, three-dimensional second image, and measurement data.
  • By using a portable device such as a smartphone as the evaluation device 1, it is possible to evaluate the disease regardless of place and time.
  • In the evaluation, the data is evaluated based on the degree of association (first degree of association) set in three or more stages.
  • The degree of association can be described by a numerical value from 0 to 100%, for example, but is not limited to this, and may be configured with any number of stages as long as it can be described by a numerical value in three or more stages.
  • In the present embodiment, the evaluation can be performed without overlooking cases where the degree of association is extremely low, for example, 1%. Even reference information with an extremely low degree of association is shown as being connected as a slight sign, which makes it possible to suppress oversights and misjudgments.
  • FIG. 10 is a schematic view showing the medical system 100 according to the second embodiment.
  • As the evaluation device 1, for example, an electronic device such as a smartphone is used.
  • the evaluation device 1 includes a first imaging unit 22.
  • the medical device 2 is, for example, a medical device related to the urinary system.
  • the medical device 2 has an irradiation unit 21.
  • the medical device 2 may include an examination information generation unit 20.
  • FIG. 11 is a schematic view showing the medical system 100 according to the third embodiment.
  • As the evaluation device 1, for example, an electronic device such as a smartphone is used; the evaluation device 1 has an inspection information generation unit 20 including a first imaging unit 22.
  • the inspection information generation unit 20 may include a second imaging unit 23, a measurement unit 24, a sound collection unit 25, and the like.
  • the medical device 2 is, for example, a medical device related to the oral cavity.
  • the medical device 2 has a holding portion 26 made of a tongue depressor.
  • the medical device 2 has an irradiation unit 21 at the tip of the holding unit 26.
  • the medical device 2 further includes a handle portion 201, a trigger portion 202, and a support portion 203.
  • the medical device 2 may include an examination information generation unit 20.
  • the handle portion 201 is for holding the medical device 2 by the user's hand, and is formed in a predetermined shape such as a tubular shape.
  • The trigger portion 202 is connected to the handle portion 201 and can be pulled by the user's finger. By pulling the trigger portion 202, the user executes various functions of the inspection information generation unit 20, the evaluation device 1, and the like. For example, by pulling the trigger portion 202, the first imaging unit 22 is operated to capture the first image.
  • the support portion 203 is connected to the handle portion 201 and supports the evaluation device 1 in a predetermined state.
  • the support portion 203 can be fixed by sandwiching the evaluation device 1 made of an electronic device such as a smartphone, for example.
  • a holding portion 26 made of a tongue depressor can be attached to the support portion 203 on the opposite side of the handle portion 201.
  • The medical device 2 has the handle portion 201 for the user to hold, the trigger portion 202 that is connected to the handle portion 201 and, when pulled by the user's finger, executes the functions of the inspection information generation unit 20, and the support portion 203 that is connected to the handle portion 201 and supports the evaluation device 1.
  • the medical device 2 is formed in a so-called gun shape.
  • Functions of the inspection information generation unit 20, such as imaging by the first imaging unit 22, can be executed by pulling the trigger portion 202. Therefore, convenience for the user is improved.
  • FIG. 12 is a schematic view showing the medical system 100 according to the fourth embodiment.
  • the medical device 2 is, for example, a medical device related to the oral cavity.
  • the medical device 2 has a holding portion 26 made of a tongue depressor.
  • the medical device 2 may have an irradiation unit 21 in the holding unit 26 and may be embodied as a so-called glowing tongue depressor.
  • a holding portion 26 is attached to the tip of the handle portion 201, and a first imaging portion 22 is arranged and attached toward the holding portion 26.
  • the medical device 2 may include an examination information generation unit 20.
  • FIG. 13 is a schematic view showing the medical system 100 according to the fifth embodiment.
  • the medical device 2 is, for example, a medical device related to the oral cavity.
  • the medical device 2 has a holding portion 26 made of a tongue depressor.
  • the medical device 2 may have an irradiation unit 21 in the holding unit 26 and may be embodied as a so-called glowing tongue depressor.
  • the holding unit 26 has a plurality of first imaging units 22 on both sides thereof. Further, the holding unit 26 has a first imaging unit 22 at the tip end portion.
  • the medical device 2 may include an examination information generation unit 20.
  • the medical device 2 has a plurality of first imaging units 22 on both sides of the holding unit 26. As a result, the upper and lower sides of the oral cavity can be imaged at the same time.
  • FIG. 14 is a schematic view showing the medical system 100 according to the sixth embodiment.
  • the medical device 2 is, for example, a medical device related to the skin.
  • the medical device 2 has a cover member 204 for covering the skin S.
  • the cover member 204 is formed, for example, in the shape of a hemisphere.
  • the irradiation unit 21 and the first imaging unit 22 are provided inside the cover member 204.
  • the medical device 2 may include an examination information generation unit 20.
  • the medical device 2 is connected to the evaluation device 1 via, for example, a cable or the like.
  • the medical device 2 has a cover member 204 for covering the skin S, and has an irradiation unit 21 and a first imaging unit 22 inside the cover member 204. This makes it possible to reduce the influence of ambient light when capturing an image of the skin. Therefore, the evaluation of the disease can be performed more accurately.
  • 1: Evaluation device 10: Housing 11: Information DB 12: Acquisition unit 13: Evaluation unit 14: Output unit 15: Input unit 16: Update unit 2: Medical device 20: Inspection information generation unit 21: Irradiation unit 22: First imaging unit 23: Second imaging unit 24: Measurement unit 25: Sound collecting unit 26: Holding unit 201: Handle unit 202: Trigger unit 203: Support unit 204: Cover member 3: Server 4: Public communication network 100: Medical system 101: CPU 102: ROM 103: RAM 104: Storage unit 105: I/F 106: I/F 107: I/F 108: Input part 109: Output part 110: Internal bus S110: Inspection information generation step S111: Irradiation step S112: First imaging step S113: Second imaging step S114: Measurement step S120: Acquisition step S130: Evaluation step S140: Output step S150: Update step

Abstract

[Problem] To provide a medical system that can improve accuracy in disease assessment on the basis of images. [Solution] This medical system is provided with: an irradiation means which irradiates a human or animal target with irradiation light; a first imaging means which captures a first image of the target irradiated with the irradiation light; an acquisition means which acquires input information including said first image; a reference database which stores past input information, reference information corresponding to the past input information, and a three-stage or higher association degree between the past input information and the reference information; an assessment means which refers to the reference database and generates assessment information which includes a three-stage or higher first association degree between the input information acquired by the acquisition means and the reference information; and an output means which generates assessment results on the basis of the assessment information and outputs the assessment results, wherein the reference information includes disease information relating to a disease.

Description

Medical system, medical device, and medical program
The present invention relates to a medical system, a medical device, and a medical program for assisting in the diagnosis of a human or animal subject.
Conventionally, the techniques disclosed in Patent Documents 1 and 2 have been proposed for observing patients using images.
The oral cavity insertion tool of Patent Document 1 is attached to a monitor device provided with an elongated scope portion and inserted into the oral cavity. It has an elongated cover member with a hole through which the scope portion is inserted, an elongated flat-plate-shaped tongue pressure portion provided on a side portion of the cover member, and fixing means for fixing the cover member to the monitor device, wherein the width of at least the tip portion of the tongue pressure portion is larger than the width of the cover member.
The intra-oral imaging system of Patent Document 2 comprises a manually positionable intraoral image capture device, a display, a processor, and a non-transitory memory storing instructions that, when executed by the processor, cause the system to: receive live image data of a patient from the intraoral image capture device; display the live image data on the display; access a previously stored intraoral image of the patient from the non-transitory memory; generate an alignment mask based on the accessed previously stored intraoral image; display the alignment mask superimposed on the live image data on the display; capture a new intraoral image of the patient from the live image data; and store the new intraoral image in the non-transitory memory.
Patent Document 1: JP-A-2009-153923 Patent Document 2: JP-A-2016-140760
Here, when image evaluation by a third party is required, such as for a second opinion or a medical dispute, a highly accurate image evaluation may not be possible from the attending physician's medical record alone. In addition, the subjectivity of the evaluator may intervene. For these reasons, a quantitative, image-based evaluation of the disease cannot be performed, and the accuracy of the evaluation is low.
In this regard, the technique disclosed in Patent Document 1 merely allows the oral cavity and pharyngeal cavity to be observed while viewing the captured video (image), and cannot quantitatively evaluate a disease based on the image.
Further, the technique disclosed in Patent Document 2 merely assists a dental professional in capturing a series of dental images with appropriate, consistent alignment using an imaging device, and likewise cannot quantitatively evaluate a disease based on the images.
The present invention has been devised in view of the above problems, and an object thereof is to provide a medical system, a medical device, and a medical program capable of improving the accuracy of image-based disease evaluation.
The medical system according to the present invention comprises: irradiation means for irradiating a human or animal subject with irradiation light; first imaging means for capturing a first image of the subject irradiated with the irradiation light; acquisition means for acquiring input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; evaluation means for referring to the reference database and generating evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition means and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
The medical system according to the present invention comprises: first imaging means for capturing a two-dimensional first image of a human or animal subject; second imaging means for capturing a three-dimensional second image of the subject; acquisition means for acquiring input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; evaluation means for referring to the reference database and generating evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition means and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
The medical device according to the present invention comprises: an irradiation unit that irradiates a human or animal subject with irradiation light; a first imaging unit that captures a first image of the subject irradiated with the irradiation light; an acquisition unit that acquires input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association between the input information acquired by the acquisition unit and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
The medical device according to the present invention comprises: a first imaging unit that captures a two-dimensional first image of a human or animal subject; a second imaging unit that captures a three-dimensional second image of the subject; an acquisition unit that acquires input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition unit and the reference information; and output means for generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
The medical program according to the present invention causes a computer to execute: an irradiation step of irradiating a human or animal subject with irradiation light; a first imaging step of capturing a first image of the subject irradiated with the irradiation light; an acquisition step of acquiring input information including the first image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
The medical program according to the present invention causes a computer to execute: a first imaging step of capturing a two-dimensional first image of a human or animal subject; a second imaging step of capturing a three-dimensional second image of the subject; an acquisition step of acquiring input information including the first image and the second image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and a degree of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association of three or more levels between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result, wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
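For orientation only, the sequence of steps recited in the program claims above (irradiation, imaging, acquisition, evaluation, output) can be sketched as a program skeleton. This is not the patented implementation; every function body and data value below is a hypothetical stand-in for hardware or processing the claims leave unspecified. The step labels S111 to S140 follow the reference sign list.

```python
# Illustrative skeleton only: the claimed step sequence with trivial
# stand-in implementations. All names and values are hypothetical.

def irradiate(subject):                      # irradiation step (S111)
    subject["irradiated"] = True

def capture_first_image(subject):            # first imaging step (S112)
    return {"subject_id": subject["id"], "pixels": b"\x00" * 4}

def evaluate(input_info, reference_db):      # evaluation step (S130)
    # Produce evaluation information including a first degree of
    # association between the input information and reference information.
    return {"disease": reference_db["default"], "degree": reference_db["degree"]}

def run_medical_program(subject, reference_db):
    irradiate(subject)
    first_image = capture_first_image(subject)
    input_info = {"first_image": first_image}          # acquisition step (S120)
    evaluation_info = evaluate(input_info, reference_db)
    # output step (S140): format the evaluation result
    return f"{evaluation_info['disease']}: {evaluation_info['degree']:.0%}"

print(run_medical_program({"id": 1}, {"default": "influenza", "degree": 0.8}))
```

The skeleton only fixes the order of the claimed steps; any concrete device would replace the stand-ins with real irradiation, imaging, and database logic.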
According to the present invention, it is possible to improve the accuracy of image-based disease evaluation.
FIG. 1 is a schematic view showing a configuration example of the treatment support system according to the first embodiment.
FIGS. 2(a) and 2(b) are schematic views showing an example of the evaluation result in the first embodiment.
FIG. 3 is a schematic view showing an example of a first image including incidental information.
FIG. 4(a) is a schematic view showing an example of the configuration of the treatment support device in the embodiment, and FIG. 4(b) is a schematic view showing an example of the functions of the treatment support device in the first embodiment.
FIG. 5 is a schematic view showing an example of the reference database in the first embodiment.
FIG. 6 is a schematic view showing a first modification of the reference database in the first embodiment.
FIG. 7 is a schematic view showing a second modification of the reference database in the first embodiment.
FIG. 8 is a schematic view showing an example of the configuration of the medical device according to the first embodiment.
FIG. 9 is a flowchart showing an example of the operation of the medical system according to the first embodiment.
FIG. 10 is a schematic view showing the medical system according to the second embodiment.
FIG. 11 is a schematic view showing the medical system according to the third embodiment.
FIG. 12 is a schematic view showing the medical system according to the fourth embodiment.
FIG. 13 is a schematic view showing the medical system according to the fifth embodiment.
FIG. 14 is a schematic view showing the medical system according to the sixth embodiment.
Hereinafter, an example of the medical system according to an embodiment of the present invention will be described with reference to the drawings.
(First Embodiment: Configuration of the medical system 100)
An example of the configuration of the medical system 100 according to the first embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the overall configuration of the medical system 100 according to the first embodiment.
As shown in FIG. 1, the medical system 100 includes an evaluation device 1 and a medical device 2. The medical device 2 is connected to the evaluation device 1 via, for example, a public communication network 4 (network), and may also be connected to, for example, a server 3. Alternatively, the medical device 2 may be connected to the evaluation device 1 by wire.
The medical system 100 is used when diagnosing a human or animal subject. The evaluation device 1 receives input information including an image of the subject and outputs an evaluation result based on that input information. As shown in FIG. 2(a), for example, the evaluation device 1 outputs an evaluation result based on input information including a first image of the subject irradiated with irradiation light. Alternatively, as shown in FIG. 2(b), the evaluation device 1 outputs an evaluation result based on input information including a two-dimensional first image and a three-dimensional second image of the subject. The evaluation result includes disease information on the subject's disease, for example influenza. Based on the evaluation result, a user can, for example, support the diagnosis of a human or animal disease.
<Evaluation device 1>
When outputting an evaluation result, the evaluation device 1 refers to a reference database. The reference database stores previously acquired past input information, the reference information used to evaluate the past input information, and the degree of association between the past input information and the reference information. The evaluation device 1 refers to the reference database, calculates a first degree of association between the input information and the reference information, and generates an evaluation result based on evaluation information that includes the first degree of association.
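Purely as an illustration of the database lookup described above, and not as the patented implementation, the reference database and the calculation of a first degree of association might be sketched as follows. The table contents, feature encoding, and similarity measure are all hypothetical:

```python
# Illustrative sketch only: a toy reference database that maps past input
# information (here, two numeric features) to reference (disease) information
# together with a stored degree of association on three or more levels.
from math import dist

# Hypothetical entries: ((body temp, throat redness score), disease, degree)
REFERENCE_DB = [
    ((38.5, 0.9), "influenza", 0.8),
    ((36.8, 0.2), "no disease", 0.9),
    ((37.5, 0.6), "pharyngitis", 0.6),
]

def evaluate(input_features):
    """Return reference info ranked by a first degree of association,
    derived from similarity of the input to each past input entry."""
    results = []
    for past_features, disease, degree in REFERENCE_DB:
        # Similarity in (0, 1]: closer past input -> higher similarity.
        similarity = 1.0 / (1.0 + dist(input_features, past_features))
        results.append((disease, round(similarity * degree, 3)))
    return sorted(results, key=lambda r: r[1], reverse=True)

print(evaluate((38.2, 0.8)))
```

In practice the patent's input information is an image (plus optional measurement and record data) and the degrees of association would come from the stored database rather than a hand-written table; the nearest-neighbor weighting here is only one plausible way to realize the described lookup.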
The input information includes a first image, which is a two-dimensional image of the subject. The input information may further include a second image, imaging condition information, medical record data, measurement data, and audio data. The input information includes examination information containing the first image; the examination information may further include the second image, the measurement data, and the audio data.
The first image is captured by the first imaging unit 22. The first image may be a still image or a moving image, and may include audio data. The first image may be a two-dimensional image of the subject irradiated by the irradiation unit 21, or a two-dimensional image of the subject not irradiated by the irradiation unit 21.
As shown in FIG. 3, the first image may include incidental information on an abnormal site. The incidental information includes site information indicating the abnormal site, discoloration information on the state of discoloration of the abnormal site, and shape information on the state of the shape of the abnormal site. A first image including incidental information is generated, for example, by entering the incidental information through the input unit 15 on a first image captured by the first imaging unit 22. The incidental information may be displayed on the first image. The discoloration information and the shape information may include, for example, a severity score conforming to the guidelines of the Japanese Society of Oral and Pharyngeal Science.
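One way the incidental information described above could be attached to a captured first image is sketched below. This is only an illustrative data-structure sketch: the field names and the 0-3 severity scale are assumptions, not taken from the patent or from the cited guidelines.

```python
# Illustrative sketch only: attaching incidental information (site,
# discoloration, shape) to a first image record. The 0-3 severity
# scale and all field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class IncidentalInfo:
    site: str                 # site information, e.g. "left tonsil"
    discoloration_score: int  # severity score, assumed 0 (none) to 3 (severe)
    shape_score: int          # severity score for shape abnormality

@dataclass
class FirstImage:
    pixels: bytes
    annotations: list = field(default_factory=list)

    def annotate(self, info: IncidentalInfo) -> None:
        # Reject scores outside the assumed 0-3 severity range.
        if not (0 <= info.discoloration_score <= 3 and 0 <= info.shape_score <= 3):
            raise ValueError("severity scores must be within 0-3")
        self.annotations.append(info)

img = FirstImage(pixels=b"\x00" * 16)
img.annotate(IncidentalInfo("left tonsil", 2, 1))
print(len(img.annotations))
```

A real implementation would also store where the annotation sits in the image so it can be overlaid on the display, as the patent describes.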
The second image is captured by the second imaging unit 23 and is a three-dimensional image of the subject in the region corresponding to the first image. The second image may be a still image or a moving image, may include audio data, and may also include incidental information.
The imaging condition information includes information on the imaging conditions of the first image and of the second image, such as the distance to the subject, the illuminance of the irradiation unit 21, and the imaging range.
The medical record data is data from the medical record of the disease and includes, for example, blood pressure, body temperature, physical findings, and laboratory findings.
The measurement data is measured by the measurement unit 24. The measurement data may include data measured using a biochemical reaction, such as an ELISA (Enzyme-Linked Immunosorbent Assay) method or an enzyme method, and data measured using an electrochemical method. The measurement data may include data measured by spectroscopic techniques, including infrared spectrum data. The measurement data may also include data measured using a natural frequency, data measured using electrophoresis, and temperature data of the subject.
The reference information includes at least one of disease information, prescription drug information, biological information, and treatment information on the subject's disease. The reference information may further include advertising information.
The disease information includes, for example, information on influenza. It includes information on intraoral diseases such as inflammation of the oral mucosa, erosions, ulcers, tumors, enlargement of the tonsils, white coating, Koplik spots, pharyngitis, osteonecrosis of the jaw, and oral mucositis caused by chemotherapy. It includes, for example, information on anemia and information on body temperature, as well as information on respiratory arrest, cardiac arrest, deep coma, pupil diameter, pupil fixation, brainstem reflexes, the pupillary light reflex, and spontaneous respiration. It includes information on eye diseases such as inflammation of the eyeball or conjunctiva, erosions, ulcers, tumors, hyperemia, bleeding, edema, jaundice of the eyeball, exophthalmos, and pupil diameter. It includes information on nasal diseases such as intranasal inflammation, erosions, ulcers, tumors, wounds, and bleeding; on ear diseases such as inflammation of the ear or eardrum, erosions, ulcers, tumors, and wounds; on neck diseases such as swelling of the cervical or supraclavicular lymph nodes, cervical aneurysms, and jugular venous distension; and on lip diseases such as inflammation of the lips, erosions, ulcers, tumors, wounds, burns, and cyanosis. The disease information includes information on Parkinson's disease and on Cushing's syndrome. It includes information on diseases of the breast and nipple such as inflammation, erosions, ulcers, tumors, and wounds; on diseases of the chest and axillary lymph nodes such as inflammation of the chest skin or axillary lymph nodes, erosions, ulcers, tumors, wounds, and respiratory muscle fatigue; and on skin diseases such as inflammation, erosions, ulcers, tumors, masses, wounds, sunburn, burns, frostbite, gangrene, bleeding, blemishes, rashes, hair loss, and changes in color tone. It includes information on diseases of the abdomen and inguinal lymph nodes such as inflammation of the abdominal skin or inguinal lymph nodes, erosions, tumors, and wounds; on urogenital diseases (genitals, urethral opening, vagina, etc.) such as inflammation, erosions, ulcers, tumors, wounds, internal hemorrhoids, external hemorrhoids, and anal fistulas; on lower limb diseases such as edema, varicose veins, gangrene, and lymphedema; on diseases of the upper limbs and hands such as edema, varicose veins, gangrene, lymphedema, and palmar erythema; and on diseases of the fingers and nails such as changes in nail color tone and clubbing.
Prescription drug information is information on a prescription drug to be prescribed to the subject. The prescription drug information proposes, for example, a therapeutic drug for the target disease; specifically, it is information such as "Tamiflu is a therapeutic drug for influenza." This information can be conveyed not only by displaying text on the output portion 109 such as a display, but also by voice.
Biological information includes information on human or animal genes, body temperature, and heart rate.
Treatment information is information on treatments, for example, gargling, taking vitamins, and getting enough sleep.
Advertising information is information on advertisements for a given product or service, for example, throat lozenges, gummies, supplements, and mouthwashes.
<Structure of evaluation device 1>
FIG. 4A is a schematic view showing an example of the configuration of the evaluation device 1. An electronic device such as a smartphone or a personal computer (PC) is used as the evaluation device 1. The evaluation device 1 may be formed integrally with, for example, any device capable of acquiring input information. The evaluation device 1 includes a housing 10, a CPU 101, a ROM 102, a RAM 103, a storage unit 104, and I/Fs 105 to 107. The components 101 to 107 are connected by an internal bus 110.
The CPU (Central Processing Unit) 101 controls the entire evaluation device 1. The ROM (Read Only Memory) 102 stores the operation code of the CPU 101. The RAM (Random Access Memory) 103 is a work area used when the CPU 101 operates. The storage unit 104 stores various kinds of information such as images. As the storage unit 104, a data storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive) is used. The evaluation device 1 may also have a GPU (Graphics Processing Unit) (not shown); having a GPU enables higher-speed arithmetic processing than usual.
The I/F 105 is an interface for sending and receiving various kinds of information to and from the server 3 and the like via the public communication network 4. The I/F 106 is an interface for sending and receiving information to and from the input portion 108. As the input portion 108, for example, a keyboard is used, and a user of the medical system 100 or the like inputs various kinds of information or control commands for the evaluation device 1 via the input portion 108. The I/F 107 is an interface for sending and receiving various kinds of information to and from the output portion 109. The output portion 109 outputs various kinds of information stored in the storage unit 104, the processing status of the evaluation device 1, and the like. As the output portion 109, a display is used, which may be, for example, a touch-panel type.
FIG. 4B is a schematic diagram showing an example of the functions of the evaluation device 1. The evaluation device 1 includes an information DB 11, an acquisition unit 12, an evaluation unit 13, an output unit 14, and an input unit 15. The evaluation device 1 may also include, for example, an update unit 16. The functions shown in FIG. 4B are realized by the CPU 101 executing a program stored in the storage unit 104 or the like, using the RAM 103 as a work area. Each of the components 11 to 16 may be controlled by, for example, artificial intelligence.
<Information DB 11>
The information DB 11 stores a reference database in which past input information acquired in advance and the reference information used with the past input information are stored. The information DB 11 also stores various kinds of information, such as input information, evaluation information including the first degree of association between the input information and the reference information, the evaluation result generated based on the evaluation information, and a format for displaying the evaluation result. The reference database and the various kinds of information are stored, as a database, in the storage unit 104 embodied as an HDD, an SSD, or the like. Each of the components 12 to 16 stores various kinds of information in the information DB 11 or retrieves it as needed.
As shown in FIG. 5, for example, the reference database stores degrees of association, in three or more levels, between the past input information and the reference information. The reference database is formed by, for example, an algorithm capable of calculating the degree of association. The past input information and the reference information each comprise a plurality of data, and the relationship between each piece of past data and each piece of reference data is linked by a degree of association.
For example, the first image A included in the past input information has a degree of association of "80%" with the disease A included in the reference information and a degree of association of "15%" with the disease B included in the reference information. That is, the "degree of association" indicates the degree of connection between pieces of data; the higher the degree of association, the stronger the connection.
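As an illustrative sketch, the association-degree relationships just described can be held as a simple mapping from past input information to reference information. All names and values below are hypothetical examples, not data from the actual system:

```python
# Minimal sketch of the reference database: each piece of past input
# information is linked to each piece of reference information by a
# degree of association (three or more levels, here as fractions).
# All keys and values are illustrative placeholders.

REFERENCE_DB = {
    "first_image_A": {"disease_A": 0.80, "disease_B": 0.15, "disease_C": 0.01},
    "first_image_B": {"disease_A": 0.05, "disease_B": 0.60, "disease_C": 0.30},
}

def degrees_of_association(past_input_key):
    """Return the stored degrees of association for one piece of past input."""
    return REFERENCE_DB[past_input_key]

print(degrees_of_association("first_image_A"))
```

In practice such a table would be learned and stored in the storage unit 104 or the server 3; the point here is only the many-to-many linkage weighted by degree.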
The past input information and the reference information may be stored in the reference database in image or video form, or in other forms such as numerical values, matrices (vectors), or histograms.
The degrees of association are calculated using, for example, machine learning, such as deep learning. In particular, the medical system 100 of the present embodiment preferably uses a convolutional neural network. In the medical system 100 of the present embodiment, the degrees of association are calculated by machine learning using the past input information and the reference information as a data set, whereby the reference database is constructed.
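The convolutional network itself is beyond the scope of a short sketch; the fragment below only illustrates, under that assumption, how a classifier's raw outputs could be normalized into degrees of association over the reference information using a softmax, as is typical in deep-learning setups. The logit values and the disease ordering are hypothetical:

```python
import math

def softmax(logits):
    """Convert raw classifier outputs into degrees of association summing to 1."""
    m = max(logits)                      # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for (disease_A, disease_B, disease_C) produced by a
# CNN applied to a first image.
degrees = softmax([2.5, 0.8, -1.9])
print([round(d, 3) for d in degrees])   # highest degree for disease_A
```

Softmax outputs are one convenient way to obtain the "three or more levels" of association the text requires, since they form a continuous ranking over all reference entries.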
As shown in FIG. 6, for example, the degree of association of the past input information is calculated based on the relationship between the reference information and a combination of the past first image and at least one of the past second image, the past imaging condition information, the past chart data, and the past measurement data. For example, the combination of the first image A included in the past first images and the second image A included in the past second images has a degree of association of "80%" with disease A and "15%" with disease B. In this case, the past first images and the past second images can each be stored as independent data. Therefore, when generating an evaluation result based on input information, described later, the accuracy can be improved and the range of options can be expanded.
As shown in FIG. 7, for example, the degree of association of the past input information is likewise calculated based on the relationship between the reference information and a combination of the past first image and at least one of the past second image, the past imaging condition information, the past chart data, and the past measurement data. In the example of FIG. 7, the past measurement data is omitted from the past input information. The reference information includes disease information, prescription drug information, treatment information, and advertising information. For example, the combination of the first image B included in the past first images, the chart data A included in the past chart data, and the imaging condition A included in the past imaging condition information has a degree of association of "50%" with disease B, "80%" with treatment A, and "20%" with advertisement B. In this case, each piece of past input information can be stored independently. Therefore, when generating an evaluation result based on input information, described later, the accuracy can be improved and the range of options can be expanded.
<Acquisition unit 12>
The acquisition unit 12 generates input information to be evaluated and acquires the input information. The acquisition unit 12 may acquire input information generated by the medical device 2. The input information may be acquired via the public communication network 4 or via a storage medium such as a portable memory. The format of the input information is arbitrary; for example, the acquisition unit 12 may convert it into an arbitrary file format. The acquisition unit 12 may also acquire examination information from the medical device 2, generate input information including the acquired examination information, and acquire the generated input information.
<Evaluation unit 13>
The evaluation unit 13 acquires evaluation information including a first degree of association, in three or more levels, between the input information and the reference information. The evaluation unit 13, for example, refers to the reference database, selects past input information that matches or is similar to the input information, and calculates the degree of association linked to the selected past input information as the first degree of association. Alternatively, the evaluation unit 13 may use the reference database as, for example, a classifier algorithm to calculate the first degree of association between the input information and the reference information.
For example, when the reference database shown in FIG. 5 is used and the input information matches or is similar to the first image A, first degrees of association of "80%" for disease A, "15%" for disease B, and "1%" for disease C are calculated.
When the reference databases shown in FIGS. 6 and 7 are used and the input information is similar to, for example, the first image A and the second image B, the degree of association between that combination and the reference data is calculated as the first degree of association. For example, a value obtained by multiplying the degree of association between the combination of the first image A and the second image B and the reference data by an arbitrary coefficient may be calculated as the first degree of association. Likewise, when the input information is similar to the first image B, the measurement data A, and the imaging condition A, a value obtained by multiplying the degree of association between that combination and the reference data by an arbitrary coefficient is calculated as the first degree of association.
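The coefficient weighting described above can be sketched minimally as follows; the stored degree of association and the similarity coefficient are hypothetical values introduced for illustration:

```python
# Sketch of weighting a stored degree of association: when the input
# information is only similar (not identical) to a stored combination,
# the stored degree may be multiplied by an arbitrary coefficient to
# obtain the first degree of association. Values are illustrative.

def first_degree(stored_degree, similarity_coefficient):
    """Scale a stored degree of association by a similarity coefficient."""
    return stored_degree * similarity_coefficient

# A stored degree of 80% for a matched combination, scaled by a 0.9
# similarity coefficient for a merely similar input.
print(round(first_degree(0.80, 0.9), 2))  # 0.72
```

The choice of coefficient (how similarity is quantified) is left open by the text; a distance-based score between the input and the stored past input would be one natural option.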
After calculating the first degree of association, the evaluation unit 13 acquires evaluation information including the input information, the reference information, and the first degree of association. The evaluation unit 13 may calculate the first degree of association by referring to, for example, the reference database shown in FIG. 5.
<Output unit 14>
The output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result. The output unit 14 generates the evaluation result for the input information based on, for example, the first degree of association in the evaluation information. The output unit 14 may also output the evaluation information as the evaluation result without processing it. The evaluation result may include the input information, the reference information, and the first degree of association.
The output unit 14 outputs the generated evaluation result. The output unit 14 outputs the evaluation result to the output portion 109 via the I/F 107, and may also output the evaluation result to an arbitrary device via, for example, the I/F 105. The output unit 14 may convert voice data collected by the sound collecting unit 25 into text and output the transcribed voice data together with the evaluation result. At this time, the site corresponding to the lesion may be clearly indicated in the first image of the evaluation result. This saves doctors and others the trouble of writing medical records.
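A minimal sketch of how an evaluation result ranked by first degree of association might be formatted for display on the output portion 109; the labels and degrees are hypothetical placeholders:

```python
# Sketch of formatting an evaluation result: rank the reference entries
# by first degree of association and render them as text. All names and
# values are illustrative.

def format_evaluation_result(input_name, degrees):
    """Render degrees of association as a ranked, human-readable report."""
    lines = [f"Evaluation result for {input_name}:"]
    for label, degree in sorted(degrees.items(), key=lambda kv: -kv[1]):
        lines.append(f"  {label}: {degree:.0%}")
    return "\n".join(lines)

result = format_evaluation_result(
    "first_image_A", {"disease_A": 0.80, "disease_B": 0.15, "disease_C": 0.01})
print(result)
```

A real output unit would render this to a display or transmit it to another device; the ranking by degree is the essential step.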
<Input unit 15>
The input unit 15 receives input information transmitted from the medical device 2 via the I/F 105, and also receives various kinds of information input from the input portion 108 via, for example, the I/F 106. In addition, the input unit 15 may receive, for example, input information stored in the server 3, or may receive input information via a storage medium such as a portable memory. The input unit 15 also receives, for example, update data created by the user based on the evaluation result and learning data used to update the degrees of association.
<Update unit 16>
When a new relationship between past input information and reference information is acquired, the update unit 16 reflects the relationship in the degrees of association. As data to be reflected in the degrees of association, for example, update data including input information newly created by the user and the reference information corresponding to that input information is used. Alternatively, learning data created by the user based on the evaluation result may be used.
For example, suppose a doctor performs an examination while checking the evaluation result generated and output based on the input information. When the doctor mutters "This looks like influenza" while examining a patient suspected of having influenza, the sound collecting unit 25 collects the doctor's voice as voice data. The sound collecting unit 25 transcribes the collected voice data into text, performs natural language processing, extracts the disease information mentioned by the doctor, and generates a label as reference information conforming to the reference data set. These constitute processed voice data. Any of these combinations is recorded in the information DB 11 or the like via, for example, the input unit 15. Then, the input information on which the evaluation result was based and the voice data are used as new input information, and the relationship between the new input information and the disease information in the reference information is learned as a data set. The update unit 16 includes, in particular, a mechanism that assists in inserting the input information and the label as a new record into the data set.
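The update flow above can be sketched as follows: transcribe the utterance, extract a disease label, and append the (input information, label) pair to the data set as a new record for relearning. The simple keyword matching here stands in for the natural language processing the text describes, and all names are hypothetical:

```python
# Sketch of update unit 16 inserting a new labeled record. The label
# extraction is deliberately naive (keyword lookup); the actual system
# applies natural language processing to the transcribed voice data.

DATASET = []  # list of (input_information, label) records

KNOWN_DISEASES = {"influenza", "anemia"}  # illustrative label vocabulary

def label_from_transcript(transcript):
    """Pick the first known disease word out of a transcribed utterance."""
    for word in transcript.lower().replace(".", "").split():
        if word in KNOWN_DISEASES:
            return word
    return None

def insert_record(input_information, transcript):
    """Append (input information, label) to the data set, as update unit 16 does."""
    label = label_from_transcript(transcript)
    if label is not None:
        DATASET.append((input_information, label))
    return label

insert_record({"first_image": "image_123"}, "This looks like influenza.")
print(DATASET)  # [({'first_image': 'image_123'}, 'influenza')]
```

Records accumulated this way become the data set from which the degrees of association are relearned.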
<Medical device 2>
As shown in FIG. 8, the medical device 2 includes an examination information generation unit 20 and a holding unit 26. The medical device 2 can send and receive various kinds of information to and from the evaluation device 1 via, for example, the public communication network 4. The medical device 2 can also generate input information and transmit it to the evaluation device 1. The medical device 2 may include, for example, the configuration of the evaluation device 1. The medical device 2 is used, for example, when examining the oral cavity of a subject. The medical device 2 may also be used to examine, for example, the subject's forehead, eyeballs, tongue, ears, nasal cavity, neck, face, axillae, breasts, nipples, chest, abdomen, lower limbs, upper limbs, fingers, body fluids, urinary organs, genital organs, or the entire skin. The medical device 2 is formed, for example, in a pen shape, or may be formed in a gun shape, a bridge shape, an endoscope shape, a capsule shape, or the like.
The examination information generation unit 20 includes an irradiation unit 21, a first imaging unit 22, a second imaging unit 23, a measurement unit 24, and a sound collecting unit 25. The examination information generation unit 20 generates examination information and generates input information including the examination information.
The irradiation unit 21 irradiates light; as the irradiation unit 21, a light source such as a visible-light or infrared LED is used.
The first imaging unit 22 captures a two-dimensional first image of the subject and transmits it to the evaluation device 1. The first imaging unit 22 may capture a two-dimensional first image of the subject illuminated by the irradiation unit 21. The first imaging unit 22 images the lesion of the subject.
The second imaging unit 23 captures a three-dimensional second image of the subject in the region corresponding to the first image and transmits it to the evaluation device 1. From the three-dimensional second image, the second imaging unit 23 can acquire surface unevenness data of the subject. The second imaging unit 23 images the lesion of the subject.
The measurement unit 24 uses a predetermined sensor to measure measurement data of the subject and transmits it to the evaluation device 1. The measurement unit 24 may measure the measurement data using a biochemical reaction, such as an ELISA (Enzyme-Linked Immunosorbent Assay) method or an enzymatic method. In these methods, an enzyme, an antibody, or the like is immobilized on the surface of the sensor; when it reacts with a substance contained in the sample, the amount of color development changes through a chemical reaction between the enzyme and a color former. The color development is measured by an optical sensor, an image sensor, visual inspection, or the like.
The measurement unit 24 may also measure the measurement data using a biochemical reaction such as an electrochemical method. In this method, an enzyme, an antibody, or the like is immobilized on the surface of the sensor; when it reacts with a substance contained in the sample, electrons move on the sensor surface through a chemical reaction. The substance can be identified by measuring this electron transfer. The measurement unit 24 may also measure the measurement data by, for example, spectroscopy. In this method, light in, for example, the infrared or near-infrared region is transmitted through the sample and the absorbance of the transmitted light is measured, whereby the substance can be identified. In addition, infrared spectrum data of, for example, the inside of the subject's oral cavity may be measured.
The measurement unit 24 may also measure measurement data using a natural frequency. In this method, the sensor surface is made to bind a specific substance, and the natural frequency is measured by applying vibration at a fixed period. Trace molecules can be measured from this natural frequency to identify the substance.
The measurement unit 24 may also measure measurement data using electrophoresis. In this method, for example, a voltage is applied across a flow channel in the sensor to perform electrophoresis. Molecules thereby migrate in the flow channel, and the measurement target can be identified by its molecular weight.
The measurement unit 24 may also measure temperature data of the subject, and may measure the distance to the subject.
The holding unit 26 holds the subject in a predetermined state. As the holding unit 26, for example, a tongue depressor or a mouthpiece for holding the subject's oral cavity open is used.
The sound collecting unit 25 collects voice data of the subject, for example, at the time of examination or consultation. The sound collecting unit 25 may be included in the first imaging unit 22. The sound collecting unit 25 may transcribe the collected voice data into text, perform natural language processing, extract disease information, and generate a label conforming to the reference data set.
<Server 3>
The server 3 stores data (a database) relating to various kinds of information. This database accumulates information sent via, for example, the public communication network 4. The server 3 may store, for example, the same information as the information DB 11, and may send and receive various kinds of information to and from the evaluation device 1 via the public communication network 4. As the server 3, for example, a database server on a network may be used. The server 3 may be used in place of the storage unit 104 or the information DB 11 described above.
<Public communication network 4>
The public communication network 4 (network) is an Internet network or the like to which the evaluation device 1 and the like are connected via communication circuits. The public communication network 4 may be composed of a so-called optical fiber communication network. The public communication network 4 is not limited to a wired communication network and may be realized as a wireless communication network.
(Embodiment: Operation of the medical system 100)
Next, an example of the operation of the medical system 100 in the present embodiment will be described. FIG. 9 is a flowchart showing an example of the operation of the medical system 100 in the present embodiment.
<Examination information generation step S110>
First, the examination information generation unit 20 generates examination information including the first image (examination information generation step S110). The examination information is generated, for example, as follows. The examination information generation step S110 includes an irradiation step S111, a first imaging step S112, a second imaging step S113, and a measurement step S114.
<Irradiation step S111>
The irradiation unit 21 irradiates a subject, a human or an animal, with irradiation light (irradiation step S111). At this time, the holding unit 26 holds the subject's oral cavity open. The irradiation unit 21 stores the irradiated illuminance in the information DB 11 as imaging condition information. The examination information generation unit 20 generates the imaging condition information.
<First imaging step S112>
Next, the first imaging unit 22 captures a two-dimensional first image of the subject's oral cavity illuminated by the irradiation unit 21 (first imaging step S112). At this time, the first imaging unit 22 may acquire voice data. The first imaging unit 22 stores the first image in the information DB 11. The medical device 2 generates examination information including the first image.
<Second imaging step S113>
Next, the second imaging unit 23 captures a three-dimensional second image of the subject's oral cavity (second imaging step S113). At this time, the second imaging unit 23 may acquire voice data. The second imaging unit 23 stores the second image in the information DB 11. The medical device 2 generates examination information including the second image. When the irradiation step S111 is performed, the second imaging step S113 may be omitted; conversely, when the second imaging step S113 is performed, the irradiation step S111 may be omitted.
<Measurement step S114>
Next, the measurement unit 24 acquires the measurement data of the subject (measurement step S114). The measurement unit 24 stores the measurement data in the information DB 11. The medical device 2 generates examination information including the measurement data. The measurement step S114 may be omitted if unnecessary.
In the examination information generation step S110, the order of the first imaging step S112, the second imaging step S113, and the measurement step S114 is arbitrary. The medical device 2 generates input information including examination information that contains the imaging condition information, the first image, the second image, and the measurement data, and transmits the generated input information to the evaluation device 1. The input information may instead be transmitted to the evaluation device 1 sequentially, after the completion of each of the irradiation step S111, the first imaging step S112, the second imaging step S113, and the measurement step S114.
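Assembling the examination information from steps S111 to S114 into input information can be sketched as follows; the field names and placeholder values are hypothetical, and the optional steps simply contribute optional fields:

```python
# Sketch of assembling input information from the examination steps.
# Each step adds its piece to the examination information; the assembled
# input information is then sent to the evaluation device 1.

def generate_input_information():
    examination = {}
    examination["imaging_conditions"] = {"illuminance_lux": 500}  # S111
    examination["first_image"] = "2d_image_bytes"                 # S112
    examination["second_image"] = "3d_image_bytes"                # S113 (optional)
    examination["measurement_data"] = {"temperature_c": 36.8}     # S114 (optional)
    return {"examination_information": examination}

input_information = generate_input_information()
print(sorted(input_information["examination_information"]))
```

Because the text leaves the step order arbitrary, the fields could equally be filled in any order, or transmitted one at a time as each step completes.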
In the examination information generation step S110, the examination information generation unit 20 may, as necessary, generate imaging condition information and chart data and generate input information that further includes the imaging condition information and the chart data.
<Acquisition step S120>
Next, the acquisition unit 12 acquires input information including the examination information (acquisition step S120). The acquisition unit 12 acquires the examination information generated by the medical device 2 via the input unit 15, and may also acquire the input information via, for example, the public communication network 4 or a storage medium such as a portable memory. Likewise, the acquisition unit 12 acquires the examination information, the imaging condition information, and the chart data generated by the medical device 2 via the input unit 15, and may also acquire them via, for example, the public communication network 4 or a storage medium such as a portable memory. The acquisition unit 12 may store the acquired input information and the like in the information DB 11. The timing at which the acquisition unit 12 acquires the input information from the medical device 2 is arbitrary, as are the imaging conditions and the like of the first image that the acquisition unit 12 acquires.
The acquisition unit 12 may acquire input information including the first image and at least one of the second image, the imaging condition information, the chart data, and the measurement data. For example, the acquisition unit 12 may acquire input information including the first image and the second image; input information including the first image, the imaging condition information, and the second image; or input information including the first image, the imaging condition information, the second image, and the measurement data.
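As an illustrative sketch only (the function name and the dictionary representation are assumptions for illustration, not part of the disclosed embodiment), the optional combinations of input information described above could be assembled as follows:

```python
def build_input_information(first_image, second_image=None,
                            imaging_conditions=None, chart_data=None,
                            measurement_data=None):
    """Assemble input information from the mandatory first image and any
    subset of the optional elements (second image, imaging condition
    information, chart data, measurement data)."""
    info = {"first_image": first_image}
    for key, value in (("second_image", second_image),
                       ("imaging_conditions", imaging_conditions),
                       ("chart_data", chart_data),
                       ("measurement_data", measurement_data)):
        if value is not None:
            info[key] = value  # include only the elements actually supplied
    return info
```

For example, input information containing the first image and the second image corresponds to `build_input_information("img1", second_image="img2")`.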
<Evaluation step S130>
Next, the evaluation unit 13 refers to the reference database and acquires evaluation information including the first degree of association between the input information and the reference information (evaluation step S130). The evaluation unit 13 acquires the input information from the acquisition unit 12 and acquires the reference database from the information DB 11.
By referring to the reference database, the evaluation unit 13 can calculate the first degree of association between the input information and the reference information. For example, the evaluation unit 13 selects a past first image that matches, partially matches, or is similar to the input information and calculates the first degree of association based on the corresponding degree of association; alternatively, the evaluation unit 13 may use the reference database as a classifier algorithm to calculate the first degree of association. The evaluation unit 13 may store the calculated first degree of association and the acquired evaluation information in the information DB 11.
Referring to the reference database shown in FIG. 6, for example, the evaluation unit 13 may calculate the first degree of association between the reference information and a combination of the first image and at least one of the second image, the imaging condition information, and the measurement data. For example, the evaluation unit 13 may calculate the first degree of association between a combination of the first image and the second image and the reference information; between a combination of the first image and the imaging condition information and the reference information; or between a combination of the first image, the measurement data, and the imaging condition information and the reference information.
Referring to the reference database shown in FIG. 7, for example, the evaluation unit 13 may calculate the first degree of association between the reference information and a combination of the first image and at least one of the second image, the imaging condition information, the chart data, and the measurement data. For example, the evaluation unit 13 may calculate the first degree of association between a combination of the first image and the second image and the reference information, or between a combination of the first image, the chart data, and the imaging condition information and the reference information.
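As an illustrative sketch only (the database contents, the toy similarity function, and the max-combination rule are hypothetical assumptions, not the embodiment's actual matching or classifier algorithm), the lookup by which the evaluation unit 13 derives first degrees of association from the reference database might look like this:

```python
# Reference database: past input information -> {reference info: degree (%)}.
# The keys and degrees below are invented placeholder data.
REFERENCE_DB = {
    "image_A": {"influenza": 80, "common cold": 20},
    "image_B": {"tonsillitis": 60, "influenza": 10},
}

def similarity(a, b):
    """Toy similarity: 1.0 for an exact match, 0.5 for a partial match,
    0.0 otherwise (stands in for real image matching)."""
    if a == b:
        return 1.0
    if a in b or b in a:
        return 0.5
    return 0.0

def evaluate(input_key):
    """Return evaluation information: each piece of reference information
    with a first degree of association, weighted by how closely the input
    matches the stored past input information."""
    scores = {}
    for past_key, links in REFERENCE_DB.items():
        w = similarity(input_key, past_key)
        if w == 0.0:
            continue
        for ref, degree in links.items():
            # keep the strongest supported degree for each reference item
            scores[ref] = max(scores.get(ref, 0.0), w * degree)
    return scores
```

A matching input yields the stored degrees directly, e.g. `evaluate("image_A")` returns `influenza` at 80 and `common cold` at 20.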
<Output step S140>
Next, the output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result (output step S140). The output unit 14 acquires the evaluation information from the evaluation unit 13 or the information DB 11, and may acquire, for example, a format for displaying the evaluation result from the information DB 11. Based on the evaluation information, the output unit 14 generates the evaluation result with reference to a predetermined format such as a text format.
The output unit 14 then outputs the evaluation result to the output portion 109. Suppose, for example, that a doctor or the like conducts an examination while checking the evaluation result generated and output based on the input information. If the doctor mutters "This looks like influenza" while examining a patient suspected of having influenza, the sound collection unit 25 picks up the doctor's voice as audio data. The collected audio data is converted into text and subjected to natural language processing. The output unit 14 may output the transcribed audio data to the output portion 109 together with the evaluation result.
The output unit 14 may output the evaluation result based on, for example, the result of comparing the first degree of association with a preset notification reference value. In this case, when the notification reference value is set to "90% or more", for example, the evaluation result is output only when the first degree of association is 90% or more. In other words, the notification reference value is an arbitrary threshold value, and conditions such as "at or above" or "at or below" the notification reference value can be set arbitrarily.
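A minimal sketch of this comparison (function name, default threshold, and condition labels are illustrative assumptions; the embodiment only specifies that the threshold and condition are arbitrary):

```python
def should_output(first_degree, threshold=90, condition="at_least"):
    """Decide whether the output unit should output the evaluation result,
    comparing the first degree of association (%) with the preset
    notification reference value under an arbitrarily chosen condition."""
    if condition == "at_least":
        return first_degree >= threshold
    if condition == "at_most":
        return first_degree <= threshold
    raise ValueError("unknown condition")
```

With the example setting "90% or more", `should_output(95)` permits output while `should_output(89)` suppresses it.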
<Update step S150>
Thereafter, when the update unit 16 newly acquires a relationship between past input information and the reference information, it may reflect that relationship in the degree of association (update step S150). The update unit 16 acquires, for example, update data newly created by the user and reflects it in the degree of association. The update unit 16 may also acquire, for example, learning data created by the user based on the evaluation result and reflect it in the degree of association. The update unit 16 calculates and updates the degree of association using, for example, machine learning, for which a convolutional neural network may be used, for example.
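As an illustrative sketch only: the embodiment describes a convolutional-neural-network-based update, but the idea of reflecting a newly acquired relationship in the stored degree of association can be shown with a simple exponential moving average as a stand-in (function name, learning rate, and table layout are assumptions):

```python
def update_association(db, past_input, reference, observed_degree, lr=0.1):
    """Reflect a newly acquired relationship between past input information
    and reference information in the stored degree of association.
    A simple exponential moving average stands in for the CNN-based
    learning described in the embodiment."""
    links = db.setdefault(past_input, {})
    old = links.get(reference, 0.0)
    links[reference] = old + lr * (observed_degree - old)
    return links[reference]
```

Repeated confirmations of the same relationship gradually strengthen the degree of association without discarding the previously learned value.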
With this, the operation of the medical system 100 in the present embodiment is completed. Whether or not to perform the above-described update step S150 is optional. The above operation may also be executed by a computer as the medical program of the present embodiment.
According to the present embodiment, the evaluation unit 13 refers to the reference database and acquires evaluation information including the first degree of association between the input information including the first image and the reference information. The reference information is therefore linked to the input information, and a quantitative evaluation result of a disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation. As a result, the system can be embodied as a highly versatile device that even a non-specialist can use easily.
Further, according to the present embodiment, the input information includes the first image, which is a two-dimensional image, and the second image, which is a three-dimensional image. A disease can therefore be evaluated based on input information containing both the first image and the second image, rather than only one of them. This makes it possible to further improve the accuracy of disease evaluation.
Further, according to the present embodiment, the input information further includes the imaging condition information. A disease can therefore be evaluated based on input information that takes the imaging conditions into account in addition to the first image. This makes it possible to further improve the accuracy of disease evaluation. Moreover, a disease can be evaluated based on input information containing images that are, so to speak, standardized by uniform imaging condition information, so that disease evaluation can be performed with higher accuracy.
Further, according to the present embodiment, the input information further includes the measurement data. A disease can therefore be evaluated based on input information that takes the measurement data into account in addition to the first image. This makes it possible to further improve the accuracy of disease evaluation.
Further, according to the present embodiment, the input information further includes the chart data. A disease or the like can therefore be evaluated based on input information that takes the chart data into account in addition to the first image. This makes it possible to further improve the accuracy of evaluating diseases and the like.
Further, according to the present embodiment, the first image or the second image includes incidental information. A disease or the like can therefore be evaluated with higher accuracy than when a first image or a second image without incidental information is used. This makes it possible to further improve the accuracy of evaluating diseases and the like.
Further, according to the present embodiment, the reference information includes disease information. The reference information is therefore linked to the input information, and a quantitative evaluation result regarding a disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation. As a result, the system can be embodied as a highly versatile device that even a non-specialist can use easily.
Further, according to the present embodiment, the reference information includes prescription drug information. The reference information is therefore linked to the input information, and a quantitative evaluation result regarding the prescription drug to be prescribed to the target can be obtained based on the input information. This makes it possible to improve the accuracy of evaluating the prescription drug to be prescribed to the target. As a result, the system can be embodied as a highly versatile device that even a non-specialist can use easily.
Further, according to the present embodiment, the reference information includes treatment information. The reference information is therefore linked to the input information, and a quantitative evaluation result regarding the treatment for the target can be obtained based on the input information. This makes it possible to improve the accuracy of evaluating the treatment for the target. As a result, the system can be embodied as a highly versatile device that even a non-specialist can use easily.
Further, according to the present embodiment, the reference information further includes advertising information. The reference information is therefore linked to the input information, and a quantitative evaluation result regarding the products and services to be advertised to the target can be obtained based on the input information. This makes it possible to propose the most suitable products and services to the target.
Further, according to the present embodiment, the holding unit 26 for holding the target in a predetermined state is provided. The first image and the like can therefore be captured while the target is held in a fixed state, which reduces the burden on the target when the first image and the like are captured.
Further, according to the present embodiment, the irradiation unit 21 for irradiating the target with predetermined irradiation light is provided. The target can therefore be illuminated when the first image and the like are captured, so that a predetermined illuminance condition can be satisfied regardless of the surrounding environment. As a result, disease evaluation can be performed with higher accuracy.
Further, according to the present embodiment, the irradiation unit 21 illuminates the inside of the target's oral cavity. The oral cavity can therefore be illuminated when the first image is captured, so the illuminance condition for the first image of the oral cavity can be satisfied and disease evaluation can be performed with higher accuracy. In particular, the oral cavity exhibits diverse mucosal findings and is a dark field, so providing the irradiation unit 21 makes higher-accuracy disease evaluation possible.
Further, according to the present embodiment, the holding unit 26 is a tongue depressor or a mouthpiece for holding the target's oral cavity in an opened state. The first image can therefore be captured while the target's oral cavity is held open, which reduces the burden on the target when the first image and the like of the oral cavity are captured. As a result, more reliable images can be captured even from patients with a strong pharyngeal reflex or pediatric patients.
Further, according to the present embodiment, the output unit 14 outputs the evaluation result based on the result of comparing the first degree of association with the preset notification reference value. Whether or not to output the evaluation result can therefore be controlled according to the conditions set for the notification reference value. The evaluation result can thus be output only when it is needed, so the user or the like does not have to check the evaluation result constantly. This makes it possible to reduce the workload of the user and the like.
Further, according to the present embodiment, when the update unit 16 newly acquires a relationship between past input information and the reference information, it reflects that relationship in the degree of association. The degree of association can therefore be updated easily, making it possible to improve the accuracy of evaluation.
Conventionally, in the medical field, image analysis of a patient's lesion and biochemical or electrochemical analysis of a specimen such as blood could not be performed simultaneously in the examination room. According to the present embodiment, on the other hand, the capture of the first image by the first imaging unit 22, the capture of the second image by the second imaging unit 23, and the measurement of measurement data from blood or the like can be performed at the same time. Deterioration of a specimen such as blood can thus be prevented without transporting the specimen to an analysis laboratory or the like for measurement. Disease evaluation can therefore be made more accurate and can be carried out immediately in the examination room or the like.
Further, according to the present embodiment, the first imaging unit 22, the second imaging unit 23, the measurement unit 24, and the like are provided, and an evaluation result is generated based on the target's two-dimensional first image, three-dimensional second image, and measurement data. By using a portable device such as a smartphone as the evaluation device 1, diseases can be evaluated regardless of place and time.
Further, the present embodiment is characterized in that the video data for evaluation is evaluated based on a degree of association (the first degree of association) set in three or more levels. The degree of association can be described, for example, by a numerical value from 0 to 100%, but is not limited to this and may consist of any number of levels as long as it can be described in three or more levels.
Based on such degrees of association, the reference information selected as candidates for the evaluation result for the input information can be displayed in descending or ascending order of the degree of association. By displaying the candidates in order of the degree of association, the user or the like can preferentially evaluate diseases that the target is highly likely to have. On the other hand, diseases that the target is unlikely to have can also be displayed without being excluded, so the user or the like can evaluate them without overlooking anything.
In addition, according to the present embodiment, even an extremely low degree of association, for example 1%, can be evaluated without being overlooked. Even reference information with an extremely low degree of association indicates a faint but existing link, making it possible to suppress oversights and misidentifications.
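The candidate display described above can be sketched as follows (an illustrative sketch only; the function name and tuple representation are assumptions):

```python
def ranked_candidates(evaluation):
    """Sort reference information by first degree of association in
    descending order, keeping even very low degrees (e.g. 1%) so that
    faint signs are not excluded or overlooked."""
    return sorted(evaluation.items(), key=lambda kv: kv[1], reverse=True)
```

For example, an evaluation containing a 1% candidate still appears at the bottom of the list rather than being dropped.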
(Second Embodiment)
Next, the medical system 100 according to the second embodiment will be described. FIG. 10 is a schematic view showing the medical system 100 according to the second embodiment.
As the evaluation device 1, an electronic device such as a smartphone is used, for example. The evaluation device 1 includes the first imaging unit 22. The medical device 2 is, for example, a medical device for the urinary organs and has the irradiation unit 21. The medical device 2 may also include the examination information generation unit 20.
According to the present embodiment, diseases of the urinary organs can be evaluated with higher accuracy.
(Third Embodiment)
Next, the medical system 100 according to the third embodiment will be described. FIG. 11 is a schematic view showing the medical system 100 according to the third embodiment.
As the evaluation device 1, an electronic device such as a smartphone is used, for example, and the evaluation device 1 has the examination information generation unit 20 including the first imaging unit 22. The examination information generation unit 20 may also include the second imaging unit 23, the measurement unit 24, the sound collection unit 25, and the like.
The medical device 2 is, for example, a medical device for the oral cavity. The medical device 2 has the holding unit 26 formed as a tongue depressor, and has the irradiation unit 21 at the tip of the holding unit 26. The medical device 2 further includes a handle portion 201, a trigger portion 202, and a support portion 203, and may also include the examination information generation unit 20.
The handle portion 201 allows the user to grip the medical device 2 by hand and is formed in a predetermined shape such as a cylinder. The trigger portion 202 is connected to the handle portion 201 and can be pulled with the user's finger. By pulling the trigger portion 202, the user executes various functions of the examination information generation unit 20, the evaluation device 1, and the like; for example, pulling the trigger portion 202 activates the first imaging unit 22 to capture the first image. The support portion 203 is connected to the handle portion 201 and supports the evaluation device 1 in a predetermined state. The support portion 203 can, for example, clamp and fix the evaluation device 1, which is an electronic device such as a smartphone. The holding unit 26 formed as a tongue depressor can be attached to the support portion 203 on the side opposite the handle portion 201.
According to the present embodiment, the medical device 2 has the handle portion 201 for the user to grip, the trigger portion 202 that is connected to the handle portion 201 and executes the functions of the examination information generation unit 20 when pulled with the user's finger, and the support portion 203 that is connected to the handle portion 201 to support the evaluation device 1. The medical device 2 is thereby formed into a so-called gun shape, so that, for example, the examination information generation unit 20 such as the first imaging unit 22 can be operated by pulling the trigger portion 202. This improves convenience for the user.
(Fourth Embodiment)
Next, the medical system 100 according to the fourth embodiment will be described. FIG. 12 is a schematic view showing the medical system 100 according to the fourth embodiment.
The medical device 2 is, for example, a medical device for the oral cavity. The medical device 2 has the holding unit 26 formed as a tongue depressor, has the irradiation unit 21 on the holding unit 26, and may be embodied as a so-called illuminated tongue depressor. The holding unit 26 is attached to the tip of the handle portion 201, and the first imaging unit 22 is attached so as to face the holding unit 26. The medical device 2 may also include the examination information generation unit 20.
(Fifth Embodiment)
Next, the medical system 100 according to the fifth embodiment will be described. FIG. 13 is a schematic view showing the medical system 100 according to the fifth embodiment.
The medical device 2 is, for example, a medical device for the oral cavity. The medical device 2 has the holding unit 26 formed as a tongue depressor, has the irradiation unit 21 on the holding unit 26, and may be embodied as a so-called illuminated tongue depressor. The holding unit 26 has a plurality of first imaging units 22 on each of its two faces, and also has a first imaging unit 22 at its tip. The medical device 2 may also include the examination information generation unit 20.
According to the present embodiment, the medical device 2 has a plurality of first imaging units 22 on each face of the holding unit 26. When images of the oral cavity are captured, images of the upper and lower sides of the oral cavity can therefore be captured simultaneously.
(Sixth Embodiment)
Next, the medical system 100 according to the sixth embodiment will be described. FIG. 14 is a schematic view showing the medical system 100 according to the sixth embodiment.
The medical device 2 is, for example, a medical device for the skin. The medical device 2 has a cover member 204 for covering the skin S. The cover member 204 is formed, for example, in the shape of a hemisphere and has the irradiation unit 21 and the first imaging unit 22 on its inner side. The medical device 2 may also include the examination information generation unit 20, and is connected to the evaluation device 1 via, for example, a cable or the like.
According to the present embodiment, the medical device 2 has the cover member 204 for covering the skin S, with the irradiation unit 21 and the first imaging unit 22 inside the cover member 204. This reduces the influence of ambient light when an image of the skin is captured, so the disease can be evaluated with higher accuracy.
Although embodiments of the present invention have been described, each embodiment is presented as an example and is not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and modifications can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and its equivalents.
1: Evaluation device
10: Housing
11: Information DB
12: Acquisition unit
13: Evaluation unit
14: Output unit
15: Input unit
16: Update unit
2: Medical device
20: Examination information generation unit
21: Irradiation unit
22: First imaging unit
23: Second imaging unit
24: Measurement unit
25: Sound collecting unit
26: Holding unit
201: Handle unit
202: Trigger unit
203: Support unit
204: Cover member
3: Server
4: Public communication network
100: Medical system
101: CPU
102: ROM
103: RAM
104: Storage unit
105: I/F
106: I/F
107: I/F
108: Input part
109: Output part
110: Internal bus
S110: Examination information generation step
S111: Irradiation step
S112: First imaging step
S113: Second imaging step
S114: Measurement step
S120: Acquisition step
S130: Evaluation step
S140: Output step
S150: Update step

Claims (15)

  1.  A medical system comprising:
     examination information generation means including irradiation means for irradiating a subject, which is a human or an animal, with irradiation light, and first imaging means for capturing a first image of the subject irradiated with the irradiation light, the examination information generation means generating examination information including the first image;
     acquisition means for acquiring input information including the examination information;
     a reference database storing past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information;
     evaluation means for referring to the reference database and generating evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition means and the reference information; and
     output means for generating an evaluation result based on the evaluation information and outputting the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
  2.  The medical system according to claim 1, further comprising holding means for holding the subject in a predetermined state,
     wherein the first imaging means captures the first image of the subject held in the predetermined state by the holding means.
  3.  A medical system comprising:
     examination information generation means including first imaging means for capturing a two-dimensional first image of a subject, which is a human or an animal, and second imaging means for capturing a three-dimensional second image of the subject, the examination information generation means generating examination information including the first image and the second image;
     acquisition means for acquiring input information including the examination information;
     a reference database storing past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information;
     evaluation means for referring to the reference database and generating evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition means and the reference information; and
     output means for generating an evaluation result based on the evaluation information and outputting the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
  4.  The medical system according to claim 3, further comprising holding means for holding the subject in a predetermined state,
     wherein the examination information generation means further includes irradiation means for irradiating the subject with irradiation light, and
     the first imaging means captures the first image of the subject held in the predetermined state by the holding means.
  5.  The medical system according to claim 2 or 4, wherein the holding means is a tongue depressor or a mouthpiece for holding the oral cavity of the subject in an open state, and
     the first imaging means captures the first image of the inside of the oral cavity of the subject.
  6.  The medical system according to any one of claims 1 to 5, wherein the input information further includes imaging condition information relating to an imaging condition of the first imaging means.
  7.  The medical system according to any one of claims 1 to 6, wherein the examination information generation means further includes measurement means for measuring measurement data of the subject, and
     the input information further includes the measurement data.
  8.  The medical system according to claim 7, wherein the measurement means acquires the measurement data by at least one of measurement using a biochemical reaction, measurement using an electrochemical reaction, measurement using a natural frequency, and measurement using electrophoresis.
  9.  The medical system according to any one of claims 1 to 8, further comprising updating means for, when a relationship between the past input information and the reference information is newly acquired, reflecting the relationship in the degree of association.
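For illustration only, the updating means of claim 9, which reflects a newly acquired relationship between past input information and reference information in the degree of association, can be sketched as follows. The dictionary representation, the update rule, and the rate are hypothetical assumptions, not part of the disclosure.

```python
# Hypothetical sketch: db maps (past input information, reference
# information) pairs to a degree of association in [0, 1].
def update_degree(db, past_input, reference, rate=0.1):
    """Reflect a newly acquired relationship in the degree of association."""
    key = (past_input, reference)
    if key in db:
        # Strengthen an existing association toward the maximum level.
        db[key] = db[key] + rate * (1.0 - db[key])
    else:
        # Register a newly observed relationship at the lowest level.
        db[key] = 0.2
    return db[key]

db = {("white_patch_on_tongue", "oral candidiasis"): 0.8}
update_degree(db, "white_patch_on_tongue", "oral candidiasis")  # strengthened
update_degree(db, "redness_of_pharynx", "pharyngitis")          # new entry
```

Any monotone rule that moves a confirmed association toward the highest level, or demotes a refuted one, would serve the same purpose.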
  10.  The medical system according to any one of claims 1 to 9, wherein the reference information further includes advertisement information relating to an advertisement.
  11.  The medical system according to any one of claims 1 to 10, wherein the first image includes incidental information relating to an abnormal portion.
  12.  A medical device comprising:
     an examination information generation unit including an irradiation unit that irradiates a subject, which is a human or an animal, with irradiation light, and a first imaging unit that captures a first image of the subject irradiated with the irradiation light, the examination information generation unit generating examination information including the first image;
     an acquisition unit that acquires input information including the examination information;
     a reference database storing past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information;
     an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association between the input information acquired by the acquisition unit and the reference information; and
     an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
  13.  A medical device comprising:
     an examination information generation unit including a first imaging unit that captures a two-dimensional first image of a subject, which is a human or an animal, and a second imaging unit that captures a three-dimensional second image of the subject, the examination information generation unit generating examination information including the first image and the second image;
     an acquisition unit that acquires input information including the examination information;
     a reference database storing past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information;
     an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association of three or more levels between the input information acquired by the acquisition unit and the reference information; and
     an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
  14.  A medical program causing a computer to execute:
     an examination information generation step including an irradiation step of irradiating a subject, which is a human or an animal, with irradiation light, and a first imaging step of capturing a first image of the subject irradiated with the irradiation light, the examination information generation step generating examination information including the first image;
     an acquisition step of acquiring input information including the examination information;
     an evaluation step of referring to a reference database that stores past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association between the input information acquired in the acquisition step and the reference information; and
     an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
  15.  A medical program causing a computer to execute:
     an examination information generation step including a first imaging step of capturing a two-dimensional first image of a subject, which is a human or an animal, and a second imaging step of capturing a three-dimensional second image of the subject, the examination information generation step generating examination information including the first image and the second image;
     an acquisition step of acquiring input information including the examination information;
     an evaluation step of referring to a reference database that stores past input information, reference information corresponding to the past input information, and degrees of association of three or more levels between the past input information and the reference information, and generating evaluation information including a first degree of association of three or more levels between the input information acquired in the acquisition step and the reference information; and
     an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result,
     wherein the reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, biological information, and treatment information relating to a treatment.
PCT/JP2020/027992 2019-07-26 2020-07-20 Medical system, medical device and medical program WO2021020187A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021536951A JPWO2021020187A1 (en) 2019-07-26 2020-07-20

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-137905 2019-07-26
JP2019137905 2019-07-26

Publications (1)

Publication Number Publication Date
WO2021020187A1 true WO2021020187A1 (en) 2021-02-04

Family

ID=74228297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/027992 WO2021020187A1 (en) 2019-07-26 2020-07-20 Medical system, medical device and medical program

Country Status (2)

Country Link
JP (1) JPWO2021020187A1 (en)
WO (1) WO2021020187A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH023535B2 (en) * 1980-12-26 1990-01-24 Nippon Denshin Denwa Kk
CN106295139A (en) * 2016-07-29 2017-01-04 姹ゅ钩 A kind of tongue body autodiagnosis health cloud service system based on degree of depth convolutional neural networks
JP2018175343A (en) * 2017-04-12 2018-11-15 富士フイルム株式会社 Medical image processing apparatus, method, and program
WO2018225448A1 (en) * 2017-06-09 2018-12-13 智裕 多田 Disease diagnosis support method, diagnosis support system and diagnosis support program employing endoscopic image of digestive organ, and computer-readable recording medium having said diagnosis support program stored thereon
WO2019131327A1 (en) * 2017-12-28 2019-07-04 アイリス株式会社 Oral photographing apparatus, medical apparatus, and program
US20190216333A1 (en) * 2018-01-12 2019-07-18 Futurewei Technologies, Inc. Thermal face image use for health estimation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203914853U (en) * 2014-06-28 2014-11-05 吕清林 A kind of digital stomatology endoscope
CN204581215U (en) * 2015-04-21 2015-08-26 杨海蓉 Medical spatula checks auxiliary device
JP6503535B1 (en) * 2018-12-17 2019-04-17 廣美 畑中 A diagnostic method of displaying medical images on an image by symptom level at the judgment of an AI.
CN109770824A (en) * 2018-12-25 2019-05-21 天津大学 Tongue depressor for children formula larynx is as acquisition device


Also Published As

Publication number Publication date
JPWO2021020187A1 (en) 2021-02-04

Similar Documents

Publication Publication Date Title
US11298072B2 (en) Dermoscopy diagnosis of cancerous lesions utilizing dual deep learning algorithms via visual and audio (sonification) outputs
US20210358582A1 (en) Devices, methods, and systems for acquiring medical diagnostic information and provision of telehealth services
US11011271B2 (en) Devices, methods and systems for acquiring medical diagnostic information and provision of telehealth services
JP2021118892A (en) System, method and computer program product for physiological monitoring
US20170007126A1 (en) System for conducting a remote physical examination
US20220351859A1 (en) User interface for navigating through physiological data
Marom et al. Emerging technologies for the diagnosis of otitis media
Park et al. Optical assessment of the in vivo tympanic membrane status using a handheld optical coherence tomography-based otoscope
Huynh Smartphone-based device in exotic pet medicine
JP2006149679A (en) Method, device and program for determining degree of health
WO2021020187A1 (en) Medical system, medical device and medical program
RU2013109085A (en) METHOD AND DEVICE FOR NON-INVASIVE BLOOD GLUCOSE CONTROL
US20220000371A1 (en) System, method, and apparatus for temperature asymmetry measurement of body parts
TWM467134U (en) Pulse diagnosis analysis system
MX2023000716A (en) System and method for assisting with the diagnosis of otolaryngologic diseases from the analysis of images.
Khalili Moghaddam et al. Ex vivo biosignatures
Ferreira et al. Design of a prototype remote medical monitoring system for measuring blood pressure and glucose measurement
TW202005609A (en) Oral image analysis system and method
Llorente-Ortega et al. Introducing a new dosimeter for the assessment and monitoring of vocal risk situations and voice disorders
JPWO2021020187A5 (en)
O'Brien Investigation of cervical remodeling during pregnancy with in vivo Raman spectroscopy
EP4244869A1 (en) Clinical warning score system
KR20220124555A (en) Infrared breast examination machine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20846456

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021536951

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25/05/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20846456

Country of ref document: EP

Kind code of ref document: A1