WO2021020187A1 - Medical system, medical device, and medical program - Google Patents

Medical system, medical device, and medical program

Info

Publication number
WO2021020187A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
evaluation
input information
unit
Prior art date
Application number
PCT/JP2020/027992
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
知明 久慈
高呂 梅田
純平 竹下
Original Assignee
知明 久慈
高呂 梅田
純平 竹下
Priority date
Filing date
Publication date
Application filed by 知明 久慈, 高呂 梅田, 純平 竹下
Priority to JP2021536951A
Publication of WO2021020187A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/04: Instruments combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 1/24: Instruments for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; instruments for opening or keeping open the mouth
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • The present invention relates to a medical system, a medical device, and a medical program for assisting in the diagnosis of a human or animal subject.
  • Conventionally, the techniques disclosed in Patent Documents 1 and 2 have been proposed as techniques for observing a patient using images.
  • The oral cavity inserter of Patent Document 1 is attached to a monitor device provided with a long scope portion and inserted into the oral cavity. It has a long cover member with a hole through which the scope portion is inserted, a long flat-plate-shaped tongue depressor portion provided on a side of the cover member, and a fixing means for fixing the cover member to the monitor device, and at least the tip portion of the tongue depressor portion is wider than the cover member.
  • The intraoral imaging system of Patent Document 2 includes a manually positionable intraoral image capture device, a display, a processor, and a non-transitory memory. The non-transitory memory stores instructions that, when executed by the processor, cause the system to: receive live image data of a patient from the intraoral image capture device; display the live image data on the display; access a previously stored intraoral image of the patient from the non-transitory memory; generate an alignment mask based on the previously stored intraoral image; superimpose the alignment mask on the live image data on the display; capture a new intraoral image of the patient from the live image data; and store the new intraoral image in the non-transitory memory.
  • However, Patent Document 1 merely allows observation of the oral cavity and pharyngeal cavity through the captured image, and cannot quantitatively evaluate a disease based on the image.
  • Patent Document 2 merely assists a dental specialist in capturing a series of dental images with appropriate and consistent alignment using an imaging device, and likewise cannot quantitatively evaluate a disease based on images.
  • Accordingly, an object of the present invention is to provide a medical system, a medical device, and a medical program capable of improving the accuracy of image-based disease evaluation.
  • The medical system according to the present invention includes: an irradiation means for irradiating an object, which is a human or an animal, with irradiation light; a first imaging means for capturing a first image of the object irradiated with the irradiation light; an acquisition means for acquiring input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and three or more levels of degree of association between the past input information and the reference information; an evaluation means that refers to the reference database and generates evaluation information including a first degree of association, in three or more levels, between the input information acquired by the acquisition means and the reference information; and an output means that generates an evaluation result based on the evaluation information and outputs the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • The medical system according to another aspect of the present invention includes: a first imaging means for capturing a two-dimensional first image of an object, which is a human or an animal; a second imaging means for capturing a three-dimensional second image of the object; an acquisition means for acquiring input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and three or more levels of degree of association between the past input information and the reference information; an evaluation means that refers to the reference database and generates evaluation information including a first degree of association, in three or more levels, between the input information acquired by the acquisition means and the reference information; and an output means that generates an evaluation result based on the evaluation information and outputs the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • The medical device according to the present invention includes: an irradiation unit that irradiates an object, which is a human or an animal, with irradiation light; a first imaging unit that captures a first image of the object irradiated with the irradiation light; an acquisition unit that acquires input information including the first image; a reference database storing past input information, reference information corresponding to the past input information, and three or more levels of degree of association between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association between the input information acquired by the acquisition unit and the reference information; and an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • The medical device according to another aspect of the present invention includes: a first imaging unit that captures a two-dimensional first image of an object, which is a human or an animal; a second imaging unit that captures a three-dimensional second image of the object; an acquisition unit that acquires input information including the first image and the second image; a reference database storing past input information, reference information corresponding to the past input information, and three or more levels of degree of association between the past input information and the reference information; an evaluation unit that refers to the reference database and generates evaluation information including a first degree of association, in three or more levels, between the input information acquired by the acquisition unit and the reference information; and an output unit that generates an evaluation result based on the evaluation information and outputs the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • The medical program according to the present invention causes a computer to execute: an irradiation step of irradiating an object, which is a human or an animal, with irradiation light; a first imaging step of capturing a first image of the object irradiated with the irradiation light; an acquisition step of acquiring input information including the first image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and three or more levels of degree of association between them, and generating evaluation information including a first degree of association between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • The medical program according to another aspect of the present invention causes a computer to execute: a first imaging step of capturing a two-dimensional first image of an object, which is a human or an animal; a second imaging step of capturing a three-dimensional second image of the object; an acquisition step of acquiring input information including the first image and the second image; an evaluation step of referring to a reference database, which stores past input information, reference information corresponding to the past input information, and three or more levels of degree of association between them, and generating evaluation information including a first degree of association, in three or more levels, between the input information acquired in the acquisition step and the reference information; and an output step of generating an evaluation result based on the evaluation information and outputting the evaluation result. The reference information includes at least one of disease information relating to a disease, prescription drug information relating to a prescription drug, and treatment information relating to a treatment.
  • FIG. 1 is a schematic view showing a configuration example of the medical system according to the first embodiment.
  • FIGS. 2(a) and 2(b) are schematic views showing examples of the evaluation result in the first embodiment.
  • FIG. 3 is a schematic diagram showing an example of a first image including incidental information.
  • FIG. 4A is a schematic diagram showing an example of the configuration of the evaluation device in the first embodiment, and
  • FIG. 4B is a schematic diagram showing an example of the function of the evaluation device in the first embodiment.
  • FIG. 5 is a schematic diagram showing an example of a reference database according to the first embodiment.
  • FIG. 6 is a schematic diagram showing a first modification of the reference database according to the first embodiment.
  • FIG. 7 is a schematic diagram showing a second modification of the reference database according to the first embodiment.
  • FIG. 8 is a schematic view showing an example of the configuration of the medical device according to the first embodiment.
  • FIG. 9 is a flowchart showing an example of the operation of the medical system according to the first embodiment.
  • FIG. 10 is a schematic diagram showing a medical system according to the second embodiment.
  • FIG. 11 is a schematic diagram showing a medical system according to the third embodiment.
  • FIG. 12 is a schematic diagram showing a medical system according to the fourth embodiment.
  • FIG. 13 is a schematic diagram showing a medical system according to the fifth embodiment.
  • FIG. 14 is a schematic diagram showing a medical system according to the sixth embodiment.
  • FIG. 1 is a block diagram showing an overall configuration of the medical system 100 according to the first embodiment.
  • The medical system 100 includes an evaluation device 1 and a medical device 2.
  • The medical device 2 may be connected to the evaluation device 1 via, for example, a public communication network 4 (network), and may also be connected to, for example, a server 3.
  • The medical device 2 may alternatively be connected to the evaluation device 1 by wire.
  • The medical system 100 is used when diagnosing a human or animal target.
  • The evaluation device 1 receives input information including an image of the target and outputs an evaluation result based on the input information. As shown in FIG. 2(a), for example, the evaluation device 1 outputs an evaluation result based on input information including a first image of the target irradiated with irradiation light. Further, as shown in FIG. 2(b), for example, the evaluation device 1 outputs an evaluation result based on input information including a two-dimensional first image and a three-dimensional second image of the target.
  • The evaluation result includes, for example, disease information regarding a target disease such as influenza. Based on the evaluation result, for example, the user can be assisted in diagnosing a human or animal disease.
  • The evaluation device 1 refers to a reference database when outputting the evaluation result.
  • The reference database stores past input information acquired in advance, reference information used for evaluating the past input information, and degrees of association between the past input information and the reference information.
  • The evaluation device 1 refers to the reference database, calculates a first degree of association between the input information and the reference information, and generates an evaluation result based on evaluation information including the first degree of association.
  • The input information includes the first image, which is a two-dimensional image of the target.
  • The input information may further include a second image, imaging condition information, medical record data, measurement data, and voice data.
  • The input information includes examination information including the first image.
  • The examination information may further include the second image, measurement data, and voice data.
  • The first image is captured by the first imaging unit 22.
  • The first image may be a still image or a moving image.
  • The first image may include voice data.
  • The first image may be a two-dimensional image of the target irradiated by the irradiation unit 21.
  • The first image may also be a two-dimensional image of a target that has not been irradiated by the irradiation unit 21.
  • The first image may include incidental information regarding an abnormal site.
  • The incidental information includes site information indicating the abnormal site, discoloration information regarding the discoloration state of the abnormal site, and shape information regarding the shape state of the abnormal site.
  • The first image including the incidental information is generated, for example, by inputting the incidental information via the input unit 15 into the first image captured by the first imaging unit 22.
  • The incidental information may be displayed on the first image.
  • The discoloration information and the shape information may include, for example, a severity score based on the guidelines of the Japanese Society of Oral and Pharyngeal Science.
  • The second image is captured by the second imaging unit 23.
  • The second image is a three-dimensional image of the target in the region corresponding to the first image.
  • The second image may be a still image or a moving image.
  • The second image may include voice data.
  • The second image may include incidental information.
  • The imaging condition information includes information on the imaging conditions of the first image and information on the imaging conditions of the second image.
  • The imaging condition information includes, for example, information on the distance to the target, the illuminance of the irradiation unit 21, the imaging range, and the like.
  • The medical record data is data from the medical record regarding the disease.
  • The medical record data includes, for example, blood pressure, body temperature, physical findings, laboratory findings, and the like.
  • The measurement data is measured by the measuring unit 24.
  • The measurement data may include data measured by the measuring unit 24 using a biochemical reaction such as an ELISA (Enzyme-Linked ImmunoSorbent Assay) method or an enzyme method.
  • The measurement data may include data measured using an electrochemical technique.
  • The measurement data may include data measured by a spectroscopic technique.
  • The measurement data may include infrared spectrum data.
  • The measurement data may include data measured using a natural frequency.
  • The measurement data may include data measured using electrophoresis.
  • The measurement data may include temperature data of the target.
  • The reference information includes at least one of disease information, prescription drug information, biological information, and treatment information relating to the target disease.
  • The reference information may further include advertising information.
  • Disease information includes, for example, information on influenza.
  • Disease information includes information on diseases in the oral cavity, such as inflammation of the oral mucosa, erosion, ulcer, tumor, enlargement of the tonsils, white coating, Koplik spots, pharyngitis, jaw bone necrosis, and oral mucositis due to chemotherapy.
  • Disease information includes, for example, information about anemia.
  • Disease information includes, for example, information about body temperature.
  • Disease information includes, for example, information on respiratory arrest, cardiac arrest, deep coma, pupil diameter, pupil fixation, brainstem reflex, light reflex, spontaneous reflex, and the like.
  • Disease information includes information on eye diseases such as inflammation of the eyeball and conjunctiva, erosion, ulcer, tumor, hyperemia, bleeding, edema and jaundice of the eyeball, eye protrusion, pupil diameter, and the like.
  • Disease information includes information on nasal diseases such as inflammation in the nasal cavity, erosions, ulcers, tumors, wounds, bleeding, etc.
  • Disease information includes information on ear diseases such as inflammation of the ear and eardrum, erosion, ulcer, tumor, wound, etc.
  • Disease information includes information on cervical diseases such as swelling of the cervical and supraclavicular lymph nodes, cervical aneurysms, and jugular vein distension.
  • Disease information includes information on lip diseases such as lip inflammation, erosions, ulcers, tumors, wounds, burns, cyanosis, etc.
  • Disease information includes information about Parkinson's disease.
  • Disease information includes information about Cushing's syndrome.
  • Disease information includes information on breast and nipple diseases such as inflammation of the breast and nipple, erosion, ulcer, tumor, and wound.
  • Disease information includes information on diseases of the chest and axillary lymph nodes such as inflammation of the skin and axillary lymph nodes of the chest, erosions, ulcers, tumors, wounds and respiratory muscle fatigue.
  • Disease information includes information on skin diseases such as skin inflammation, erosions, ulcers, tumors, masses, wounds, sunburns, burns, frostbite, necrosis, bleeding, stains, rashes, hair loss, and changes in color tone.
  • Disease information includes information on diseases of the abdomen and inguinal lymph nodes, such as inflammation of the abdominal skin and inguinal lymph nodes, erosions, tumors, and wounds.
  • Disease information includes information on genitourinary diseases such as inflammation of the genitourinary system (pudendal region, urinary meatus, vagina, etc.), erosion, ulcer, tumor, wound, internal hemorrhoid, external hemorrhoid, anal fistula, etc.
  • Disease information includes information on lower limb diseases such as lower limb edema, varicose veins, necrosis, lymphedema and the like.
  • Disease information includes information on diseases of the upper limbs and fingers, such as edema of the upper limbs, varicose veins, necrosis, lymphedema, and palmar erythema.
  • Disease information includes information on changes in nail color tone and finger diseases such as clubbed fingers.
  • Prescription drug information is information on a prescription drug to be prescribed to the target.
  • The prescription drug information proposes, for example, a therapeutic drug for the target disease; specifically, it is information such as "Tamiflu corresponds to a therapeutic drug for influenza". This information can be conveyed not only by displaying text on the output portion 109 such as a display, but also by voice.
  • Biological information includes information on human or animal genes, information on body temperature, and information on heart rate.
  • Treatment information is information about a treatment.
  • The treatment information is, for example, information on treatments such as gargling, taking vitamins, and getting sufficient sleep.
  • Advertising information is information related to an advertisement for a given product or service.
  • The advertising information is, for example, information related to advertisements for throat lozenges, gummies, supplements, mouthwashes, and the like.
  • FIG. 4A is a schematic view showing an example of the configuration of the evaluation device 1.
  • As the evaluation device 1, an electronic device such as a smartphone or a personal computer (PC) is used.
  • The evaluation device 1 may be formed integrally with, for example, any device capable of acquiring input information.
  • The evaluation device 1 includes a housing 10, a CPU 101, a ROM 102, a RAM 103, a storage unit 104, and I/Fs 105 to 107. The components 101 to 107 are connected by an internal bus 110.
  • The CPU (Central Processing Unit) 101 controls the entire evaluation device 1.
  • The ROM (Read Only Memory) 102 stores the operation code of the CPU 101.
  • The RAM (Random Access Memory) 103 is a work area used during the operation of the CPU 101.
  • The storage unit 104 stores various information such as images.
  • As the storage unit 104, for example, a data storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive) is used.
  • The evaluation device 1 may have a GPU (Graphics Processing Unit) (not shown). Having a GPU enables faster arithmetic processing than usual.
  • The I/F 105 is an interface for transmitting and receiving various information to and from the server 3 and the like via the public communication network 4.
  • The I/F 106 is an interface for transmitting and receiving information to and from the input portion 108.
  • As the input portion 108, for example, a keyboard is used; a user or the like of the medical system 100 inputs various information or control commands for the evaluation device 1 via the input portion 108.
  • The I/F 107 is an interface for transmitting and receiving various information to and from the output portion 109.
  • The output portion 109 outputs various information stored in the storage unit 104, the processing status of the evaluation device 1, and the like.
  • As the output portion 109, a display is used, which may be, for example, of a touch-panel type.
  • FIG. 4B is a schematic diagram showing an example of the function of the evaluation device 1.
  • The evaluation device 1 includes an information DB 11, an acquisition unit 12, an evaluation unit 13, an output unit 14, and an input unit 15.
  • The evaluation device 1 may further include, for example, an update unit 16.
  • The functions shown in FIG. 4B are realized by the CPU 101 executing a program stored in the storage unit 104 or the like, using the RAM 103 as a work area. Each of the components 11 to 16 may also be controlled by, for example, artificial intelligence.
  • The information DB 11 stores a reference database in which past input information acquired in advance and the reference information used for the past input information are stored.
  • The information DB 11 also stores various information such as the input information, evaluation information including the first degree of association between the input information and the reference information, the evaluation result generated based on the evaluation information, and a format for displaying the evaluation result.
  • The reference database and the various information are stored as databases in the storage unit 104, implemented as an HDD, SSD, or the like.
  • Each of the components 12 to 16 stores various information in the information DB 11 and retrieves it as needed.
  • The reference database stores three or more levels of degree of association between the past input information and the reference information.
  • The reference database is formed by, for example, an algorithm that can calculate the degree of association.
  • The past input information and the reference information each comprise a plurality of data entries, and the relationship between each past data entry and each reference data entry is linked by a degree of association.
  • For example, the first image A included in the past input information has a degree of association of "80%" with disease A included in the reference information and a degree of association of "15%" with disease B. That is, the "degree of association" indicates the degree of connection between data entries, and the higher the degree of association, the stronger the connection.
  • The past input information and the reference information may be stored in the reference database in the form of images or video, or may be stored in the form of, for example, numerical values, matrices (vectors), or histograms.
  • The degree of association is calculated using, for example, machine learning.
  • For the machine learning, for example, deep learning is used.
  • The degree of association is calculated by machine learning using the past input information and the reference information as a data set, whereby the reference database is constructed.
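  • As a minimal illustration (not part of the patent itself), such a reference database might be organized as in the following Python sketch, where every key and percentage is a hypothetical stand-in:

        # Minimal sketch of a reference database: each past input entry maps to
        # reference information with a degree of association in percent (three or
        # more levels). All keys and values here are hypothetical illustrations.
        reference_db = {
            "first_image_A": {"disease_A": 80, "disease_B": 15, "disease_C": 5},
            "first_image_B": {"disease_B": 50, "treatment_A": 80, "advertisement_B": 20},
        }

        def degrees_of_association(past_input_key: str) -> dict:
            """Return the stored degrees of association for one past input entry."""
            return reference_db[past_input_key]

        if __name__ == "__main__":
            # First image A is linked to disease A at 80% and disease B at 15%.
            print(degrees_of_association("first_image_A"))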
  • The past input information is, for example, a combination of at least one of a past first image and a past second image, past imaging condition information, past chart data, and past measurement data.
  • The degree of association is calculated based on the relationship with the reference information. For example, the combination of the first image A included in the past first images and the second image A included in the past second images shows a degree of association of "80%" with disease A and "15%" with disease B.
  • The data of the past first images and the past second images can be stored independently. Therefore, when generating an evaluation result based on the input information described later, it is possible to improve the accuracy and expand the range of options.
  • The past input information may also be a combination of at least one of a past first image and a past second image, past imaging condition information, past chart data, and past measurement data, with the degree of association calculated based on the relationship with the reference information; in this case, the past measurement data may be omitted from the past input information.
  • The reference information includes disease information, prescription drug information, treatment information, and advertising information.
  • For example, the combination of the first image B included in the past first images, the chart data A included in the past chart data, and the imaging condition A included in the past imaging condition information shows a degree of association of "50%" with disease B, "80%" with treatment A, and "20%" with advertisement B.
  • The past input information can be stored independently. Therefore, when generating an evaluation result based on the input information described later, it is possible to improve the accuracy and expand the range of options.
  • The acquisition unit 12 generates input information to be evaluated and acquires the input information.
  • The acquisition unit 12 may also acquire input information generated by the medical device 2.
  • The input information may be acquired from the medical device 2 via the public communication network 4 or via a storage medium such as a portable memory.
  • The format of the input information is arbitrary; for example, the acquisition unit 12 may convert it into an arbitrary file format.
  • The acquisition unit 12 may acquire the examination information from the medical device 2, generate input information including the acquired examination information, and acquire the generated input information.
  • The evaluation unit 13 acquires evaluation information including a first degree of association, in three or more levels, between the input information and the reference information.
  • The evaluation unit 13 refers to the reference database, selects past input information that matches or is similar to the input information, and calculates the degree of association linked to the selected past input information as the first degree of association.
  • The evaluation unit 13 may also calculate the first degree of association between the input information and the reference information by using, for example, the reference database as the algorithm of a classifier.
  • For example, when the input information matches or is similar to the combination of the first image A and the second image B, the degree of association between that combination and the reference data is calculated as the first degree of association.
  • Alternatively, a value obtained by multiplying the degree of association between the combination of the first image A and the second image B and the reference data by an arbitrary coefficient may be calculated as the first degree of association.
  • After calculating the first degree of association, the evaluation unit 13 acquires evaluation information including the input information, the reference information, and the first degree of association.
  • The evaluation unit 13 may calculate the first degree of association by referring to, for example, the reference database shown in FIG. 5.
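  • As a rough sketch of this matching step (a hypothetical illustration, not the patent's prescribed algorithm), the evaluation unit could select the most similar past input entry and adopt its stored degrees of association as the first degrees of association:

        import math

        # Hypothetical feature vectors standing in for past input information;
        # in practice these might be embeddings of the first and second images.
        past_inputs = {
            "first_image_A": [0.9, 0.1, 0.3],
            "first_image_B": [0.2, 0.8, 0.5],
        }
        reference_db = {
            "first_image_A": {"disease_A": 80, "disease_B": 15},
            "first_image_B": {"disease_B": 50, "treatment_A": 80},
        }

        def cosine_similarity(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm_a = math.sqrt(sum(x * x for x in a))
            norm_b = math.sqrt(sum(y * y for y in b))
            return dot / (norm_a * norm_b)

        def evaluate(input_vector):
            """Pick the most similar past input and return its stored degrees
            of association as the first degrees of association."""
            best = max(past_inputs,
                       key=lambda k: cosine_similarity(input_vector, past_inputs[k]))
            return {"matched_past_input": best,
                    "first_degree_of_association": reference_db[best]}

        if __name__ == "__main__":
            print(evaluate([0.85, 0.15, 0.25]))  # closest to first_image_A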
  • The output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result.
  • The output unit 14 generates the evaluation result for the input information based on, for example, the first degree of association in the evaluation information. Alternatively, the output unit 14 may output the evaluation information as the evaluation result without further processing.
  • The evaluation result may include the input information, the reference information, and the first degree of association.
  • The output unit 14 outputs the generated evaluation result.
  • The output unit 14 may output the evaluation result to the output portion 109 via the I/F 107, or may output it to any other device via, for example, the I/F 105.
  • The output unit 14 may convert voice data collected by the sound collecting unit 25 into text and output the transcribed voice data together with the evaluation result.
  • In the evaluation result, the site of the lesion may be clearly indicated in the first image. This can save doctors and other users the trouble of writing medical records.
  • The input unit 15 receives input information transmitted from the medical device 2 via the I/F 105, and also receives various information input from the input portion 108 via, for example, the I/F 106.
  • The input unit 15 may also receive, for example, input information stored in the server 3.
  • The input unit 15 may receive input information or the like via a storage medium such as a portable memory.
  • The input unit 15 also receives, for example, update data created by the user based on the evaluation result, learning data used for updating the degree of association, and the like.
  • When the update unit 16 newly acquires a relationship between past input information and reference information, the update unit 16 reflects the relationship in the degree of association.
  • As the data to be reflected in the degree of association, for example, update data including input information newly created by the user and the reference information corresponding to that input information is used.
  • Alternatively, learning data created by the user based on the evaluation result is used.
  • For example, the sound collecting unit 25 records, as voice data, a doctor muttering "It's influenza" while examining a patient suspected of having influenza.
  • The sound collecting unit 25 converts the collected voice data into text, performs natural language processing, extracts the disease information mentioned by the doctor, and generates a label as reference information conforming to the reference data set. This constitutes the processed voice data. Any of these combinations is recorded in the information DB 11 or the like, for example via the input unit 15.
  • The update unit 16 includes a mechanism that assists in inserting the input information and the label as a new record in the data set.
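  • The label-generation side of this mechanism might look like the following sketch; the speech recognizer itself is out of scope, so the transcript stands in for its output, and simple keyword matching stands in for the natural language processing (all names hypothetical):

        from typing import Optional

        # Hypothetical mapping from keywords in the doctor's utterance to labels
        # conforming to the reference data set.
        DISEASE_KEYWORDS = {
            "influenza": "disease_influenza",
            "tonsillitis": "disease_tonsillitis",
        }

        def label_from_transcript(transcript: str) -> Optional[str]:
            """Extract a disease label from the transcribed voice data."""
            lowered = transcript.lower()
            for keyword, label in DISEASE_KEYWORDS.items():
                if keyword in lowered:
                    return label
            return None

        def insert_record(dataset: list, input_info: dict, label: str) -> None:
            """Insert (input information, label) as a new record in the data set."""
            dataset.append({"input": input_info, "label": label})

        if __name__ == "__main__":
            dataset = []
            label = label_from_transcript("It's influenza")
            if label is not None:
                insert_record(dataset, {"first_image": "image_001.png"}, label)
            print(dataset)  # one new record linking the image to disease_influenza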
  • The medical device 2 includes an examination information generation unit 20 and a holding unit 26.
  • The medical device 2 can transmit and receive various information to and from the evaluation device 1 via, for example, the public communication network 4. The medical device 2 can also generate input information and transmit it to the evaluation device 1.
  • The medical device 2 may also include, for example, the configuration of the evaluation device 1.
  • The medical device 2 is used, for example, when examining the target's oral cavity. The medical device 2 may also be used to examine, for example, the forehead, eyeballs, tongue, ears, nasal cavity, neck, face, axillae, breasts, nipples, chest, abdomen, lower limbs, upper limbs, fingers, body fluids, urinary organs, genital organs, and the skin of the whole body.
  • The medical device 2 is formed, for example, in a pen shape.
  • The medical device 2 may alternatively be of, for example, a pistol type, a bridge type, an endoscope type, a capsule type, or the like.
  • The examination information generation unit 20 includes an irradiation unit 21, a first imaging unit 22, a second imaging unit 23, a measuring unit 24, and a sound collecting unit 25.
  • The examination information generation unit 20 generates examination information and generates input information including the examination information.
  • The irradiation unit 21 emits irradiation light; as the light source, for example, a visible-light or infrared LED is used.
  • The first imaging unit 22 captures the two-dimensional first image of the target and transmits it to the evaluation device 1.
  • The first imaging unit 22 may capture a two-dimensional first image of the target irradiated by the irradiation unit 21.
  • The first imaging unit 22 images, for example, the lesion site of the target.
  • The second imaging unit 23 captures a three-dimensional second image of the target in the region corresponding to the first image and transmits it to the evaluation device 1.
  • The second imaging unit 23 can acquire unevenness (depth) data of the target through the three-dimensional second image.
  • The second imaging unit 23 images, for example, the lesion site of the target.
  • The measuring unit 24 uses a predetermined sensor to measure the target's measurement data and transmits it to the evaluation device 1.
  • The measuring unit 24 may measure the measurement data by using a biochemical reaction such as an ELISA (Enzyme-Linked ImmunoSorbent Assay) method or an enzyme method.
  • In a biochemical reaction such as the ELISA method or an enzyme method, an enzyme, an antibody, or the like is immobilized on the surface of the sensor; when it reacts with a substance contained in the sample, the amount of color development changes due to a chemical reaction between the enzyme and a color former.
  • The color development is measured by an optical sensor, an image sensor, visual inspection, or the like.
  • The measuring unit 24 may also measure the measurement data by using an electrochemical technique.
  • In the electrochemical technique, an enzyme, an antibody, or the like is immobilized on the surface of the sensor; when it reacts with a substance contained in the sample, electrons move on the surface of the sensor due to the chemical reaction.
  • A substance can be identified by measuring this movement of electrons.
  • The measuring unit 24 may measure the measurement data by, for example, a spectroscopic technique.
  • A substance can be identified by, for example, transmitting light in the infrared or near-infrared region through the sample and measuring the absorbance of the transmitted light.
  • Infrared spectrum data of, for example, the inside of the target's oral cavity may be measured.
  • The measuring unit 24 may measure the measurement data using a natural frequency.
  • The natural frequency can be measured by letting a specific substance bind to the sensor surface and applying vibration at a fixed period.
  • A trace molecule can be measured via this natural frequency to identify a substance.
  • The measuring unit 24 may measure the measurement data by using electrophoresis.
  • A voltage is applied across the flow path in the sensor to perform electrophoresis.
  • Molecules migrate through the flow path, and the measurement target can be identified by its molecular weight.
  • The measuring unit 24 may measure the target's temperature data. The measuring unit 24 may also measure the distance to the target.
  • The holding unit 26 holds the target in a predetermined state.
  • As the holding unit 26, for example, a tongue depressor or a mouthpiece that holds the target's oral cavity open is used.
  • The sound collecting unit 25 collects the target's voice data. As the voice data, the target's voice at the time of examination, medical consultation, and the like is collected.
  • The sound collecting unit 25 may be included in the first imaging unit 22.
  • The sound collecting unit 25 may convert the collected voice data into text data.
  • The sound collecting unit 25 converts the collected voice data into text, performs natural language processing, extracts disease information, and generates a label conforming to the reference data set.
  • <Server 3> The server 3 stores data (a database) of various information.
  • This database stores, for example, information transmitted via the public communication network 4.
  • The server 3 stores the same information as the information DB 11 and may transmit and receive various information to and from the evaluation device 1 via the public communication network 4.
  • A database server on the network may be used as the server 3.
  • The server 3 may be used in place of the storage unit 104 and the information DB 11 described above.
  • The public communication network 4 (network) is an Internet network or the like to which the evaluation device 1 and the like are connected via a communication circuit.
  • The public communication network 4 may be composed of a so-called optical fiber communication network. The public communication network 4 is not limited to a wired communication network and may be realized by a wireless communication network.
  • FIG. 9 is a flowchart showing an example of the operation of the medical system 100 in the present embodiment.
  • First, the examination information generation unit 20 generates examination information including the first image (examination information generation step S110).
  • The examination information generation step S110 includes an irradiation step S111, a first imaging step S112, a second imaging step S113, and a measurement step S114.
  • The irradiation unit 21 irradiates the target, which is a human or an animal, with irradiation light (irradiation step S111). At this time, the holding unit 26 holds the target's oral cavity open.
  • The irradiation unit 21 stores the illuminance of the irradiation in the information DB 11 as imaging condition information.
  • The examination information generation unit 20 thus generates imaging condition information.
  • Next, the first imaging unit 22 acquires a two-dimensional first image of the inside of the target's oral cavity irradiated by the irradiation unit 21 (first imaging step S112). At this time, the first imaging unit 22 may acquire voice data. The first imaging unit 22 stores the first image in the information DB 11. The medical device 2 generates examination information including the first image.
  • Next, the second imaging unit 23 acquires a three-dimensional second image of the inside of the target's oral cavity (second imaging step S113). At this time, the second imaging unit 23 may acquire voice data. The second imaging unit 23 stores the second image in the information DB 11. The medical device 2 generates examination information including the second image.
  • The second imaging step S113 may be omitted. When performing the second imaging step S113, the irradiation step S111 may be omitted.
  • Next, the measuring unit 24 acquires the target's measurement data (measurement step S114).
  • The measuring unit 24 stores the measurement data in the information DB 11.
  • The medical device 2 generates examination information including the measurement data.
  • The measurement step S114 may be omitted as needed.
  • The order of the first imaging step S112, the second imaging step S113, and the measurement step S114 is arbitrary.
  • The medical device 2 generates input information including the examination information (the imaging condition information, first image, second image, and measurement data) and transmits the generated input information to the evaluation device 1 (see the sketch after this list).
  • The input information may instead be transmitted to the evaluation device 1 sequentially, upon completion of each of the irradiation step S111, first imaging step S112, second imaging step S113, and measurement step S114.
  • The examination information generation unit 20 may generate imaging condition information and medical record data as necessary, and generate input information including the imaging condition information and medical record data.
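  • As a minimal sketch of the input information assembled in steps S111 to S114 (field names and values are hypothetical, chosen only for illustration):

        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class InputInformation:
            """Hypothetical container for the input information of steps S111-S114."""
            first_image: bytes                    # 2-D image from the first imaging unit 22 (S112)
            second_image: Optional[bytes] = None  # 3-D image from the second imaging unit 23 (S113, may be omitted)
            imaging_conditions: dict = field(default_factory=dict)  # e.g. illuminance stored in S111
            measurement_data: dict = field(default_factory=dict)    # from the measuring unit 24 (S114, may be omitted)

        def generate_input_information() -> InputInformation:
            """Assemble examination information into input information (sketch)."""
            return InputInformation(
                first_image=b"...2d-image-bytes...",
                second_image=b"...3d-image-bytes...",
                imaging_conditions={"illuminance_lx": 500, "distance_mm": 30},
                measurement_data={"temperature_c": 37.8},
            )

        if __name__ == "__main__":
            print(generate_input_information().imaging_conditions)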
  • Next, the acquisition unit 12 acquires the input information including the examination information (acquisition step S120).
  • The acquisition unit 12 acquires the examination information generated by the medical device 2 via the input unit 15, and may also acquire the input information via the public communication network 4 or a storage medium such as a portable memory. Likewise, the acquisition unit 12 may acquire the examination information, imaging condition information, and medical record data generated by the medical device 2 via the input unit 15, the public communication network 4, or a storage medium such as a portable memory.
  • The acquisition unit 12 may store the acquired input information and the like in the information DB 11.
  • The timing at which the acquisition unit 12 acquires the input information from the medical device 2 is arbitrary.
  • The imaging conditions and the like of the first image acquired by the acquisition unit 12 are arbitrary.
  • The acquisition unit 12 may acquire input information including at least one of the first image and the second image, the imaging condition information, the medical record data, and the measurement data.
  • The acquisition unit 12 may acquire input information including, for example, the first image and the second image.
  • The acquisition unit 12 may acquire input information including, for example, the first image, the imaging condition information, and the second image.
  • The acquisition unit 12 may acquire input information including, for example, the first image, the imaging condition information, the second image, and the measurement data.
  • Next, the evaluation unit 13 refers to the reference database and acquires evaluation information including the first degree of association between the input information and the reference information (evaluation step S130).
  • The evaluation unit 13 acquires the input information from the acquisition unit 12 and acquires the reference database from the information DB 11.
  • The evaluation unit 13 can calculate the first degree of association between the input information and the reference information by referring to the reference database.
  • The evaluation unit 13 selects, for example, a past first image that matches, partially matches, or is similar to the input information and calculates the first degree of association based on the corresponding degree of association; it may also calculate the first degree of association by using, for example, the reference database as the algorithm of a classifier.
  • The evaluation unit 13 may store the calculated first degree of association and the acquired evaluation information in the information DB 11.
  • The evaluation unit 13 may refer to, for example, the reference database shown in FIG. 6 and calculate the first degree of association between the reference information and a combination of at least one of the first image and the second image, the imaging condition information, and the measurement data.
  • For example, the evaluation unit 13 may refer to the reference database shown in FIG. 6 and calculate the first degree of association between the combination of the first image and the second image and the reference information.
  • The evaluation unit 13 may refer to, for example, the reference database shown in FIG. 6 and calculate the first degree of association between the combination of the first image and the imaging condition information and the reference information.
  • The evaluation unit 13 may refer to the reference database shown in FIG. 6 and calculate the first degree of association between the combination of the first image, the measurement data, and the imaging condition information and the reference information.
  • The evaluation unit 13 may refer to, for example, the reference database shown in FIG. 7 and calculate the first degree of association between the reference information and a combination of at least one of the first image and the second image, the imaging condition information, the chart data, and the measurement data.
  • For example, the evaluation unit 13 may refer to the reference database shown in FIG. 7 and calculate the first degree of association between the combination of the first image and the second image and the reference information.
  • The evaluation unit 13 may refer to the reference database shown in FIG. 7 and calculate the first degree of association between the combination of the first image, the chart data, and the imaging condition information and the reference information.
  • Next, the output unit 14 generates an evaluation result based on the evaluation information and outputs the evaluation result (output step S140).
  • The output unit 14 may acquire the evaluation information from the evaluation unit 13 or the information DB 11, and may acquire, for example, a format for displaying the evaluation result from the information DB 11.
  • The output unit 14 generates the evaluation result by applying the evaluation information to a predetermined format, such as a text format.
  • The output unit 14 outputs the evaluation result.
  • The output unit 14 outputs the evaluation result, for example, to the output portion 109.
  • For example, the sound collecting unit 25 records, as voice data, a doctor muttering "It's influenza" while examining a patient suspected of having influenza.
  • The collected voice data is converted into text and subjected to natural language processing.
  • The output unit 14 may output the transcribed voice data to the output portion 109 together with the evaluation result.
  • The output unit 14 may output the evaluation result based on, for example, the result of comparing a preset notification reference value with the first degree of association.
  • For example, when the notification reference value is set to "90% or more", the evaluation result is output only when the first degree of association is 90% or more. That is, the notification reference value is an arbitrary threshold, and conditions such as at or above, or below, the notification reference value can be set arbitrarily.
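  • A one-function sketch of this threshold check (the threshold and association values are illustrative, not prescribed by the patent):

        # Sketch of the notification reference value check.
        NOTIFICATION_REFERENCE_VALUE = 90  # percent, as in the "90% or more" example

        def should_output(first_degree_of_association: float,
                          threshold: float = NOTIFICATION_REFERENCE_VALUE) -> bool:
            """Output the evaluation result only when the first degree of
            association satisfies the arbitrarily settable threshold condition."""
            return first_degree_of_association >= threshold

        if __name__ == "__main__":
            print(should_output(92))  # True:  evaluation result is output
            print(should_output(75))  # False: evaluation result is suppressed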
  • Thereafter, when the update unit 16 newly acquires a relationship between past input information and reference information, it may reflect the relationship in the degree of association (update step S150). For example, the update unit 16 acquires update data newly created by the user and reflects it in the degree of association. The update unit 16 also acquires, for example, learning data created by the user based on the evaluation result and reflects it in the degree of association. The update unit 16 calculates and updates the degree of association using machine learning; for the machine learning, for example, a convolutional neural network is used, as sketched below.
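  • The following is a minimal sketch of such a machine-learning update with a small convolutional neural network; PyTorch is an assumed framework choice, and the architecture, class labels, and data are placeholders rather than the patent's specified model:

        import torch
        import torch.nn as nn

        class AssociationCNN(nn.Module):
            """Toy CNN mapping a first image to scores over reference-information
            classes; softmax of the scores can be read as degrees of association."""
            def __init__(self, num_reference_classes: int = 3):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 8, kernel_size=3, padding=1),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(8, num_reference_classes)

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        def update_step(model, images, labels, optimizer):
            """One update reflecting newly acquired (input, reference) pairs."""
            optimizer.zero_grad()
            loss = nn.functional.cross_entropy(model(images), labels)
            loss.backward()
            optimizer.step()
            return loss.item()

        if __name__ == "__main__":
            model = AssociationCNN()
            optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
            images = torch.randn(4, 3, 64, 64)   # stand-in first images
            labels = torch.tensor([0, 1, 2, 0])  # stand-in reference labels
            print(update_step(model, images, labels, optimizer))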
  • This completes the operation of the medical system 100 in the present embodiment. Whether or not the above update step S150 is performed is optional. Further, as the medical program in the present embodiment, a computer may be made to execute the above operations.
  • As described above, according to the present embodiment, the evaluation unit 13 refers to the reference database and acquires evaluation information including the first degree of association between the input information, which includes the first image, and the reference information. The reference information is thus linked to the input information, and a quantitative evaluation result of the disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation. As a result, the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • According to the present embodiment, the input information includes the first image, which is a two-dimensional image, and the second image, which is a three-dimensional image. Therefore, compared with using only one of the first image and the second image, the disease can be evaluated based on input information including both. This makes it possible to further improve the accuracy of disease evaluation.
  • According to the present embodiment, the input information further includes the imaging condition information. Therefore, in addition to the first image, the disease can be evaluated based on input information that also takes the imaging conditions into account. This makes it possible to further improve the accuracy of disease evaluation. In addition, the disease can be evaluated based on input information including standardized images with uniform imaging conditions, enabling evaluation with even higher accuracy.
  • According to the present embodiment, the input information further includes the measurement data. Therefore, in addition to the first image, the disease can be evaluated based on input information that also takes the measurement data into account. This makes it possible to further improve the accuracy of disease evaluation.
  • According to the present embodiment, the input information further includes the medical record data. Therefore, in addition to the first image, the disease and the like can be evaluated based on input information that also takes the medical record data into account. This makes it possible to further improve the accuracy of evaluation of diseases and the like.
  • According to the present embodiment, the first image or the second image includes incidental information. Therefore, the disease and the like can be evaluated with higher accuracy than when using a first or second image without incidental information.
  • According to the present embodiment, the reference information includes disease information. The reference information is thus linked to the input information, and a quantitative evaluation result regarding the disease can be obtained based on the input information. This makes it possible to improve the accuracy of disease evaluation, and the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • According to the present embodiment, the reference information includes prescription drug information. The reference information is thus linked to the input information, and a quantitative evaluation result of the prescription drug to be prescribed to the target can be obtained based on the input information. This makes it possible to improve the accuracy of evaluating the prescription drug prescribed to the target, and the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • According to the present embodiment, the reference information includes treatment information. The reference information is thus linked to the input information, and a quantitative evaluation result regarding the treatment for the target can be obtained based on the input information. This makes it possible to improve the accuracy of evaluating the treatment for the target, and the system can be embodied as a highly versatile device that even a non-specialist can easily use.
  • According to the present embodiment, the reference information further includes advertising information. The reference information is thus linked to the input information, and a quantitative evaluation result of the product or service to be advertised to the target can be obtained based on the input information. This makes it possible to propose optimal products and services to the target.
  • A holding unit 26 for holding the subject in a predetermined state is provided, so the first image and the like can be captured while the subject is held steady, reducing the burden on the subject during imaging.
  • An irradiation unit 21 irradiates the subject with predetermined irradiation light, so the subject can be illuminated when the first image and the like are captured. A predetermined illuminance condition can therefore be satisfied regardless of the surrounding environment, and the disease can be evaluated with higher accuracy.
  • The irradiation unit 21 illuminates the subject's oral cavity, so the oral cavity can be lit when the first image is captured. The illuminance condition for the first image in the oral cavity can therefore be satisfied, and the disease can be evaluated with higher accuracy. The oral cavity in particular presents varied mucosal findings and is a dark field, so providing the irradiation unit 21 is especially effective.
  • The holding unit 26 is a tongue depressor or a mouthpiece that holds the subject's oral cavity in an expanded state. The first image can therefore be captured with the oral cavity held open, reducing the burden on the subject during intraoral imaging; even patients with a strong gag reflex or pediatric patients can be imaged more reliably. A sketch of checking the illuminance condition follows below.
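The illuminance condition itself is not specified. The sketch below assumes a simple mean-luma check on the captured frame, with invented bounds, as one way a predetermined illuminance condition might be verified before a frame is accepted for evaluation.

```python
import numpy as np

# Assumed bounds for the predetermined illuminance condition (8-bit scale);
# the specification gives no concrete values.
MIN_MEAN_LUMA = 60
MAX_MEAN_LUMA = 200

def meets_illuminance_condition(image: np.ndarray) -> bool:
    """Check an H x W x 3 RGB frame's mean luma against the preset bounds."""
    # ITU-R BT.601 luma approximation
    luma = 0.299 * image[..., 0] + 0.587 * image[..., 1] + 0.114 * image[..., 2]
    return MIN_MEAN_LUMA <= float(luma.mean()) <= MAX_MEAN_LUMA
```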
  • The output unit 14 outputs an evaluation result based on a comparison between a preset notification reference value and the first degree of association. Whether an evaluation result is output can therefore be controlled by how the notification reference value is set: results are output only when needed, the user need not constantly check evaluation results, and the workload of the user and the like is reduced. A sketch of this gating follows below.
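A minimal sketch of the gating, assuming the degree of association is expressed on the 0-100% scale described later in this section; the notification reference value shown is an invented placeholder for a configurable setting.

```python
# Assumed notification reference value on the 0-100% scale.
NOTIFICATION_REFERENCE = 5.0

def maybe_output(evaluation: dict[str, float]) -> dict[str, float]:
    """Return only the entries whose first degree of association meets the
    notification reference value; an empty dict means nothing is output,
    so the user need not check results constantly."""
    return {label: degree for label, degree in evaluation.items()
            if degree >= NOTIFICATION_REFERENCE}
```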
  • When the update unit 16 newly acquires a relationship between past input information and reference information, it reflects that relationship in the degree of association. The degree of association can therefore be updated easily, improving the accuracy of evaluation; one possible update rule is sketched below.
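The update rule is not specified. The sketch below assumes an exponential-moving-average blend of a newly observed relationship into the stored degree of association, purely for illustration.

```python
def update_association(current_degree: float, observed: float,
                       learning_rate: float = 0.1) -> float:
    """Blend a newly observed relationship (on the same 0-100% scale) into
    the stored degree of association. The moving-average form and the
    learning rate are illustrative assumptions."""
    return (1.0 - learning_rate) * current_degree + learning_rate * observed

# e.g. a stored 20% association nudged upward by a newly confirmed case
new_degree = update_association(20.0, 100.0)  # -> 28.0
```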
  • The first imaging unit 22 can capture the first image, the second imaging unit 23 can capture the second image, and measurement data such as blood data can be measured at the same time.
  • Because samples such as blood need not be transported to an analysis room or the like to obtain the measurement data, sample deterioration is prevented. The accuracy of disease evaluation therefore improves, and evaluation can be performed immediately in an examination room or the like.
  • The first imaging unit 22, the second imaging unit 23, the measurement unit 24, and the like are provided, and an evaluation result is generated from the subject's two-dimensional first image, three-dimensional second image, and measurement data.
  • By using a portable device such as a smartphone as the evaluation device 1, the disease can be evaluated regardless of place and time.
  • The data to be evaluated is evaluated based on a degree of association (the first degree of association) set in three or more stages.
  • The degree of association can be expressed, for example, as a numerical value from 0 to 100%, but is not limited to this; any representation with three or more numerical stages may be used.
  • In the present embodiment, even an extremely low degree of association, for example 1%, is not overlooked: reference information with a very low degree of association is still presented as a slight sign of a connection, suppressing oversights and misjudgments. A sketch of such reporting follows below.
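A sketch of this reporting behavior, with invented labels, values, and flag wording, showing that a 1% association is kept and flagged as a slight sign rather than dropped:

```python
def report(evaluation: dict[str, float]) -> list[str]:
    """List every reference entry with a nonzero degree of association,
    strongest first; low degrees are flagged, not dropped."""
    lines = []
    for label, degree in sorted(evaluation.items(), key=lambda kv: -kv[1]):
        if degree <= 0:
            continue  # only a truly zero association is omitted
        note = "slight sign" if degree < 5 else "candidate"
        lines.append(f"{label}: {degree:.0f}% ({note})")
    return lines

print("\n".join(report({"influenza": 72.0, "measles": 1.0, "mumps": 0.0})))
# influenza: 72% (candidate)
# measles: 1% (slight sign)
```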
  • FIG. 10 is a schematic view showing the medical system 100 according to the second embodiment.
  • As the evaluation device 1, an electronic device such as a smartphone is used, for example.
  • The evaluation device 1 includes a first imaging unit 22.
  • The medical device 2 is, for example, a medical device for the urinary system.
  • The medical device 2 has an irradiation unit 21.
  • The medical device 2 may include an examination information generation unit 20.
  • FIG. 11 is a schematic view showing the medical system 100 according to the third embodiment.
  • As the evaluation device 1, an electronic device such as a smartphone is used, for example; it has an examination information generation unit 20 that includes a first imaging unit 22.
  • The examination information generation unit 20 may also include a second imaging unit 23, a measurement unit 24, a sound collecting unit 25, and the like.
  • The medical device 2 is, for example, a medical device for the oral cavity.
  • The medical device 2 has a holding unit 26 formed as a tongue depressor.
  • The medical device 2 has an irradiation unit 21 at the tip of the holding unit 26.
  • The medical device 2 further includes a handle unit 201, a trigger unit 202, and a support unit 203.
  • The medical device 2 may include an examination information generation unit 20.
  • The handle unit 201 is for holding the medical device 2 in the user's hand and is formed in a predetermined shape, such as a tube.
  • The trigger unit 202 is connected to the handle unit 201 and can be pulled with the user's finger. By pulling the trigger unit 202, the user executes various functions of the examination information generation unit 20, the evaluation device 1, and the like; for example, pulling the trigger unit 202 operates the first imaging unit 22 to capture the first image.
  • The support unit 203 is connected to the handle unit 201 and supports the evaluation device 1 in a predetermined state.
  • The support unit 203 can fix the evaluation device 1, an electronic device such as a smartphone, by clamping it, for example.
  • A holding unit 26 formed as a tongue depressor can be attached to the support unit 203 on the side opposite the handle unit 201.
  • The medical device 2 thus has the handle unit 201 for the user to hold, the trigger unit 202 that is connected to the handle unit 201 and executes the functions of the examination information generation unit 20 when pulled with the user's finger, and the support unit 203 that is connected to the handle unit 201 and supports the evaluation device 1.
  • The medical device 2 is formed in a so-called gun shape.
  • Functions of the examination information generation unit 20, such as imaging by the first imaging unit 22, can therefore be executed simply by pulling the trigger unit 202, improving convenience for the user. A sketch of this trigger wiring follows below.
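A minimal sketch of the trigger wiring, with invented class and function names; pulling the trigger unit 202 simply dispatches to whatever functions of the examination information generation unit 20 have been registered.

```python
from typing import Callable

class TriggerUnit:
    """Stand-in for the trigger unit 202: pulling it dispatches to the
    registered functions of the examination information generation unit 20."""

    def __init__(self) -> None:
        self._actions: list[Callable[[], None]] = []

    def bind(self, action: Callable[[], None]) -> None:
        self._actions.append(action)

    def pull(self) -> None:
        for action in self._actions:
            action()

def capture_first_image() -> None:
    print("first imaging unit 22: capturing the first image")

trigger = TriggerUnit()
trigger.bind(capture_first_image)
trigger.pull()  # the user pulls the trigger with a finger
```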
  • FIG. 12 is a schematic view showing the medical system 100 according to the fourth embodiment.
  • The medical device 2 is, for example, a medical device for the oral cavity.
  • The medical device 2 has a holding unit 26 formed as a tongue depressor.
  • The medical device 2 may have an irradiation unit 21 in the holding unit 26 and may be embodied as a so-called glowing tongue depressor.
  • The holding unit 26 is attached to the tip of the handle unit 201, and a first imaging unit 22 is mounted facing the holding unit 26.
  • The medical device 2 may include an examination information generation unit 20.
  • FIG. 13 is a schematic view showing the medical system 100 according to the fifth embodiment.
  • The medical device 2 is, for example, a medical device for the oral cavity.
  • The medical device 2 has a holding unit 26 formed as a tongue depressor.
  • The medical device 2 may have an irradiation unit 21 in the holding unit 26 and may be embodied as a so-called glowing tongue depressor.
  • The holding unit 26 has first imaging units 22 on both sides, and a further first imaging unit 22 at its tip.
  • The medical device 2 may include an examination information generation unit 20.
  • Because the medical device 2 has first imaging units 22 on both sides of the holding unit 26, images of the upper and lower sides of the oral cavity can be captured simultaneously when imaging the oral cavity.
  • FIG. 14 is a schematic view showing the medical system 100 according to the sixth embodiment.
  • The medical device 2 is, for example, a medical device for the skin.
  • The medical device 2 has a cover member 204 for covering the skin S.
  • The cover member 204 is formed, for example, as a hemisphere.
  • The irradiation unit 21 and the first imaging unit 22 are provided inside the cover member 204.
  • The medical device 2 may include an examination information generation unit 20.
  • The medical device 2 is connected to the evaluation device 1 via a cable or the like, for example.
  • Because the medical device 2 has the cover member 204 covering the skin S, with the irradiation unit 21 and the first imaging unit 22 inside it, the influence of ambient light during skin imaging is reduced and the disease can be evaluated more accurately.
  • 1: Evaluation device, 10: Housing, 11: Information DB, 12: Acquisition unit, 13: Evaluation unit, 14: Output unit, 15: Input unit, 16: Update unit, 2: Medical device, 20: Examination information generation unit, 21: Irradiation unit, 22: First imaging unit, 23: Second imaging unit, 24: Measurement unit, 25: Sound collecting unit, 26: Holding unit, 201: Handle unit, 202: Trigger unit, 203: Support unit, 204: Cover member, 3: Server, 4: Public communication network, 100: Medical system, 101: CPU, 102: ROM, 103: RAM, 104: Storage unit, 105: I/F, 106: I/F, 107: I/F, 108: Input unit, 109: Output unit, 110: Internal bus, S110: Examination information generation step, S111: Irradiation step, S112: First imaging step, S113: Second imaging step, S114: Measurement step, S120: Acquisition step, S130: Evaluation step, S140: Output step, S150: Update step

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Molecular Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
PCT/JP2020/027992 2019-07-26 2020-07-20 Medical system, medical device, and medical program WO2021020187A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021536951A JPWO2021020187A1 (ja) 2019-07-26 2020-07-20

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019137905 2019-07-26
JP2019-137905 2019-07-26

Publications (1)

Publication Number Publication Date
WO2021020187A1 true WO2021020187A1 (ja) 2021-02-04

Family

ID=74228297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/027992 WO2021020187A1 (ja) Medical system, medical device, and medical program

Country Status (2)

Country Link
JP (1) JPWO2021020187A1 (ja)
WO (1) WO2021020187A1 (ja)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH023535B2 (ja) * 1980-12-26 1990-01-24 Nippon Denshin Denwa Kk
CN106295139A (zh) * 2016-07-29 2017-01-04 姹ゅ钩 Tongue self-diagnosis health cloud service system based on a deep convolutional neural network
JP2018175343A (ja) * 2017-04-12 2018-11-15 富士フイルム株式会社 Medical image processing apparatus, method, and program
WO2018225448A1 (ja) * 2017-06-09 2018-12-13 智裕 多田 Method, system, and program for supporting diagnosis of diseases from endoscopic images of the digestive organs, and computer-readable recording medium storing the program
WO2019131327A1 (ja) * 2017-12-28 2019-07-04 アイリス株式会社 Intraoral imaging device, medical device, and program
US20190216333A1 (en) * 2018-01-12 2019-07-18 Futurewei Technologies, Inc. Thermal face image use for health estimation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203914853U (zh) * 2014-06-28 2014-11-05 吕清林 Digital intraoral endoscope
CN204581215U (zh) * 2015-04-21 2015-08-26 杨海蓉 Tongue depressor examination aid for internal medicine
JP6503535B1 (ja) * 2018-12-17 2019-04-17 廣美 畑中 Diagnostic method for displaying medical images classified by degree of symptom based on AI judgment
CN109770824A (zh) * 2018-12-25 2019-05-21 天津大学 Tongue-depressor-type throat image acquisition device for children

Also Published As

Publication number Publication date
JPWO2021020187A1 (ja) 2021-02-04

Similar Documents

Publication Publication Date Title
US11298072B2 (en) Dermoscopy diagnosis of cancerous lesions utilizing dual deep learning algorithms via visual and audio (sonification) outputs
JP6878628B2 (ja) System, method, and computer program product for physiological monitoring
US20210358582A1 (en) Devices, methods, and systems for acquiring medical diagnostic information and provision of telehealth services
JP2022002157A (ja) Apparatus, method, and system for acquiring medical diagnostic information and providing telehealth services
US20170007126A1 (en) System for conducting a remote physical examination
US20220351859A1 (en) User interface for navigating through physiological data
US20200383582A1 (en) Remote medical examination system and method
US20220005601A1 (en) Diagnostic device for remote consultations and telemedicine
Marom et al. Emerging technologies for the diagnosis of otitis media
Park et al. Optical assessment of the in vivo tympanic membrane status using a handheld optical coherence tomography-based otoscope
Huynh Smartphone-based device in exotic pet medicine
JP2006149679A (ja) Health level determination method, apparatus, and program
WO2021020187A1 (ja) Medical system, medical device, and medical program
RU2013109085A (ru) Method and device for non-invasive monitoring of blood glucose level
US20210275033A1 (en) System, method, and apparatus for temperature asymmetry measurement of body parts
MX2023000716A (es) System and method for assisting in the diagnosis of otorhinolaryngological diseases based on image analysis.
DE202016105331U1 (de) System for conducting a remote physical examination
Khalili Moghaddam et al. Ex vivo biosignatures
Ferreira et al. Design of a prototype remote medical monitoring system for measuring blood pressure and glucose measurement
TW202005609A (zh) Oral image analysis system and method
JPWO2021020187A5 (ja)
O'Brien Investigation of cervical remodeling during pregnancy with in vivo Raman spectroscopy
EP4244869A1 (en) Clinical warning score system
KR20220124555A (ko) Breast examination device using infrared light

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20846456

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021536951

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25/05/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20846456

Country of ref document: EP

Kind code of ref document: A1