CN114511490A - Image processing apparatus and image processing program - Google Patents
Image processing apparatus and image processing program
- Publication number
- CN114511490A CN202111356494.3A CN202111356494A
- Authority
- CN
- China
- Prior art keywords
- image
- blood vessel
- image processing
- sound
- processing apparatus
- Prior art date
- Legal status
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B7/00—Instruments for auscultation
- A61B7/02—Stethoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B7/00—Instruments for auscultation
- A61B7/02—Stethoscopes
- A61B7/04—Electric stethoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M1/00—Suction or pumping devices for medical purposes; Devices for carrying-off, for treatment of, or for carrying-over, body-liquids; Drainage systems
- A61M1/14—Dialysis systems; Artificial kidneys; Blood oxygenators ; Reciprocating systems for treatment of body fluids, e.g. single needle systems for hemofiltration or pheresis
- A61M1/16—Dialysis systems; Artificial kidneys; Blood oxygenators ; Reciprocating systems for treatment of body fluids, e.g. single needle systems for hemofiltration or pheresis with membranes
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30101—Blood vessel; Artery; Vein; Vascular
- G06T2207/30104—Vascular flow; Blood flow; Perfusion
Abstract
The invention provides an image processing apparatus and an image processing program with which the position of a patient's blood vessel can be input on an image when the blood vessel is examined. The image processing apparatus includes: an image display unit that displays an image of at least a part of the body of the patient including the position where a shunt tube is placed; and a first input unit for inputting, on the image displayed on the image display unit, position information on the image of at least one of a position of a blood vessel in the body of the patient and an examination position at which the blood vessel has been examined.
Description
Technical Field
The present invention relates to an image processing apparatus and an image processing program.
Background
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2006-346333
Disclosure of Invention
In a first aspect of the present invention, an image processing apparatus is provided. The image processing apparatus includes: an image display unit that displays an image of at least a part of the body of the patient including the position where the shunt tube is provided; and a first input unit for inputting, on the image displayed on the image display unit, position information on the image of at least one of a position of a blood vessel in the body of the patient and an examination position at which the blood vessel has been examined.
The first input unit may input position information of at least one of a position of the blood vessel and an examination position on the image through an electrical connection or an electromagnetic connection between the first input unit and the image display unit.
The image processing apparatus may further include an image acquisition unit that acquires an image.
The image processing apparatus may further include a second input unit for inputting information of the patient. The image acquiring unit may acquire the image when the information of the patient is input through the second input unit.
The image processing apparatus may further include a storage unit that stores the position information input by the first input unit in association with the image.
The image processing apparatus may further include a determination unit configured to determine an abnormality of the blood vessel based on the image.
The storage unit may store a first image in which at least one of the position of the blood vessel and the examination position at a first time is depicted, and a second image in which at least one of the position of the blood vessel and the examination position at a second time after a predetermined time has elapsed from the first time is depicted. The determination unit may determine the abnormality of the blood vessel based on the first image and the second image.
The storage unit may further store a blood flow sound in the blood vessel. The determination unit may determine the abnormality of the blood vessel based on the image and the blood flow sound.
The storage unit may further store a first blood flow sound, which is the blood flow sound at a first time, and a second blood flow sound, which is the blood flow sound at a second time after a predetermined time has elapsed from the first time. The determination unit may determine the abnormality of the blood vessel based on the first blood flow sound and the second blood flow sound.
The image processing apparatus may further include a blood flow sound acquisition unit that acquires blood flow sound.
In a second aspect of the present invention, an image processing program is provided. The image processing program causes a computer to function as an image processing apparatus.
The summary of the present invention does not list all the necessary features of the present invention. Sub-combinations of these feature groups can also constitute inventions.
Drawings
Fig. 1 is a diagram showing an example of an image processing apparatus 100 according to an embodiment of the present invention.
Fig. 2 is a diagram showing an example of a block diagram of the image processing apparatus 100 according to the embodiment of the present invention.
Fig. 3 is a view showing an example of a cross section of the image display unit 10 shown in fig. 1.
Fig. 4 is a view showing another example of a cross section of the image display unit 10 shown in fig. 1.
Fig. 5 is a diagram showing an example of at least a part of the body 22 of the patient.
Fig. 6 is a diagram showing an example of an image 11 of at least a part of a body 22 of a patient displayed on the image display unit 10.
Fig. 7 is a diagram showing an example of the first image 11-1 of at least a part of the body 22 of the patient displayed on the image display unit 10.
Fig. 8 is a diagram showing an example of the second image 11-2 of at least a part of the body 22 of the patient displayed on the image display unit 10.
Fig. 9 is a diagram showing another example of the block diagram of the image processing apparatus 100 according to the embodiment of the present invention.
Fig. 10 is a diagram illustrating an example of an image processing method according to an embodiment of the present invention.
Fig. 11 is a diagram showing an example of a computer 2200 that can wholly or partially embody the image processing apparatus 100 of the present invention.
Detailed Description
The present invention will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are necessarily essential to the solution of the invention.
Fig. 1 is a diagram showing an example of an image processing apparatus 100 according to an embodiment of the present invention. The image processing apparatus 100 includes an image display unit 10 and a first input unit 20. The image processing apparatus 100 may include a second input unit 40 and a blood flow sound acquisition unit 50. The first input unit 20 is, for example, a stylus. The blood flow sound acquisition unit 50 is, for example, a stethoscope, and may be an electronic stethoscope.
The image processing apparatus 100 includes a main body 90. The main body 90 includes the image display unit 10. The main body 90 may include the second input unit 40. The main body 90 is a computer including a CPU, a memory, an interface, and the like. The computer may be a tablet-type computer, a notebook computer, or a smartphone.
The image processing apparatus 100 may or may not include the image acquisition unit 30. The image acquisition unit 30 is, for example, an electronic camera. The image acquisition unit 30 acquires the image 11.
When the image processing apparatus 100 includes the image acquiring unit 30, the main body 90 preferably includes the image acquiring unit 30. In the case where the main body section 90 is a tablet-type computer, the image acquisition section 30 may be an electronic camera built in the tablet-type computer. When the image processing apparatus 100 does not include the image acquisition unit 30, the image 11 may be acquired by the image acquisition unit 30 other than the image processing apparatus 100, as will be described later.
When the main body 90 is a computer, the image display unit 10 may be a monitor, a display, or the like of the computer, and the second input unit 40 may be at least one of a keyboard and a mouse of the computer. An image processing program for causing the main body 90 to function as the image processing apparatus 100 may be installed in the main body 90.
The image display unit 10 displays an image 11. The image display unit 10 of this example has a front surface 12 and a rear surface 14. The front surface 12 is the surface of the image display unit 10 on which the image 11 is displayed. The rear surface 14 is the surface opposite the front surface 12 in a direction intersecting the front surface 12, and is a surface on which the image 11 is not displayed. When the user uses the main body 90, the front surface 12 of this example is visible to the user and the rear surface 14 is not.
The main body 90 and the first input unit 20 may be connected wirelessly. The main body 90 and the blood flow sound acquisition unit 50 may be connected wirelessly. The wireless standard may be Bluetooth (registered trademark), Wi-Fi (registered trademark), or Zigbee (registered trademark). The main body 90 and the first input unit 20 may instead be connected by wire. The main body 90 and the blood flow sound acquisition unit 50 may also be connected by wire.
Fig. 2 is a diagram showing an example of a block diagram of the image processing apparatus 100 according to the embodiment of the present invention. In this example, the main body 90 includes the control unit 80. When the main body 90 is a computer, the control unit 80 is, for example, a CPU of the computer. In fig. 2, the range of the body portion 90 in fig. 1 is indicated by a broken line.
The control unit 80 controls the display of the image 11 on the image display unit 10. In this example, the image 11 is an image acquired by the image acquisition unit 30. The control unit 80 controls the acquisition of the image 11 by the image acquisition unit 30. The control unit 80 also handles the position information input on the image 11 through the first input unit 20, the patient information (described later) input through the second input unit 40, and the blood flow sound information (described later) acquired by the blood flow sound acquisition unit 50.
Fig. 3 is a view showing an example of a cross section of the image display unit 10 shown in fig. 1. Fig. 3 shows a state in which the first input unit 20 is being used by the hand of the user. The first input unit 20 inputs position information (described later) on the image 11 displayed on the image display unit 10 by electrical or electromagnetic connection between the first input unit 20 and the image display unit 10. The image display unit 10 may be a touch panel type input device.
The image display unit 10 of this example includes a protection unit 15, an electrostatic detection unit 16, and a liquid crystal unit 17 in the direction from the front surface 12 to the back surface 14. The protection unit 15 is, for example, protective glass. The liquid crystal unit 17 is, for example, a liquid crystal panel. The front surface 12 of the image display unit 10 may be the surface of the protection unit 15.
The image display unit 10 of this example is an electrostatic coupling type touch panel. In this example, the electrostatic detection section 16 and the first input section 20 are electrically connected by electrostatic coupling.
Fig. 4 is a view showing another example of a cross section of the image display unit 10 shown in fig. 1. Fig. 4 shows a state where the first input unit 20 is being used by the hand of the user, as in fig. 3. The image display unit 10 of this example includes a protection unit 15, a magnetic field detection unit 18, and a liquid crystal unit 17 in a direction from the front surface 12 to the back surface 14.
The image display unit 10 of this example is an electromagnetic induction type touch panel. The magnetic field detection section 18 may have a plurality of coils. The first input 20 may have a plurality of other coils. In this example, the magnetic field detection unit 18 and the first input unit 20 are electromagnetically connected by electromagnetic induction. In this example, the coil of the magnetic field detection unit 18 and the coil of the first input unit 20 are electromagnetically connected.
Fig. 5 is a diagram showing an example of at least a part of the body 22 of the patient. In this example, the patient is assumed to be undergoing artificial dialysis treatment. In this example, at least a part of the patient's body 22 is the patient's arm. The at least a part of the patient's body 22 may be the arm on the patient's non-dominant side.
In an artificial dialysis treatment, a predetermined amount of the patient's blood is drawn out of the patient's body 22 per unit time by placing a shunt 24 in the patient's body. Shunt 24 may be embedded in the body 22 of the patient. The shunt 24 can be connected to a blood vessel 26 of a patient inside the patient's body 22. The blood vessel 26 is at least one of a vein and an artery of the patient. In fig. 5, the vessel 26 and shunt 24 are shown in phantom and solid lines, respectively.
With the shunt 24 embedded in the patient's body 22, the shunt 24 cannot be visually identified. In fig. 5, the shunt 24 is visually represented to show the location in the patient's body 22 where the shunt 24 is positioned.
Sometimes at least one of a nurse, a clinical laboratory technician, a doctor, and a patient examines the patient's blood vessel 26. In the present specification, at least one of a nurse, a clinical laboratory technician, a doctor, and a patient is referred to as a nurse or the like. A nurse or the like can check whether a vascular stenosis is produced in the patient's blood vessel 26.
The blood vessel 26 may be examined by at least one of auscultation and palpation. Whether or not a vascular stenosis is generated in the blood vessel 26 can be checked by at least one of auscultation and palpation. In the case of auscultating the blood vessel 26, the blood vessel 26 may be auscultated by the blood flow sound acquiring unit 50 (see fig. 1 and 2).
The position where the blood vessel 26 is examined is set as an examination position 28. The examination location 28 is the location along the blood vessel 26 at which the examination was performed. In the examination of the blood vessel 26, a plurality of locations along the blood vessel 26 may be examined. In this example, two locations (examination location 28-1 and examination location 28-2) in blood vessel 26 are examined.
Fig. 6 is a diagram showing an example of an image 11 of at least a part of a body 22 of a patient displayed on the image display unit 10. In this example, the image acquisition unit 30 (see fig. 1 and 2) acquires the image 11. Fig. 6 shows an example of a blood vessel position 27 and an examination position 29 on the image 11.
The blood vessel position 27 represents position information of the blood vessel 26 on the image 11. The blood vessel position 27 is position information on the image 11 corresponding to the position of the blood vessel 26 in the body 22 of the patient. The blood vessel position 27 may be position information indicating the so-called course along which the patient's blood vessel runs. The blood vessel position 27 may or may not coincide with the position of the blood vessel 26 on the image 11.
The examination position 29 indicates position information of the examination position 28 on the image 11. The examination position 29 is position information on the image 11 corresponding to the examination position 28 in the body 22 of the patient. The examination position 29 may or may not coincide with the position of the examination position 28 on the image 11. Examination position 29-1 and examination position 29-2 correspond to examination position 28-1 and examination position 28-2, respectively.
The blood vessel position 27 and the examination position 29 can be depicted on the image 11. The blood vessel position 27 and the examination position 29 may be depicted by a nurse or the like. The blood vessel position 27 can be depicted by tracing the blood vessel 26 displayed on the image display unit 10. The examination position 29 may be depicted at a position corresponding to the examination position 28 (refer to fig. 5).
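As a way to make the tracing input concrete, the following is a minimal sketch (hypothetical; the patent does not define any data format or class names) of how the blood vessel position 27 traced with the first input unit 20 and the tapped examination positions 29 could be held as pixel coordinates tied to the displayed image 11.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[int, int]  # (x, y) pixel coordinates on the displayed image 11

@dataclass
class VesselAnnotation:
    """Positions drawn on one image with the first input unit (e.g., a stylus)."""
    vessel_trace: List[Point] = field(default_factory=list)  # blood vessel position 27
    exam_points: List[Point] = field(default_factory=list)   # examination positions 29-1, 29-2, ...

    def add_trace_point(self, x: int, y: int) -> None:
        # Append one stylus sample while the user traces the blood vessel 26.
        self.vessel_trace.append((x, y))

    def add_exam_point(self, x: int, y: int) -> None:
        # Record one tapped examination position.
        self.exam_points.append((x, y))

# Example: a short trace and two examination positions (coordinates are made up)
ann = VesselAnnotation()
for p in [(120, 40), (122, 60), (125, 80), (130, 100)]:
    ann.add_trace_point(*p)
ann.add_exam_point(123, 65)
ann.add_exam_point(128, 95)
print(len(ann.vessel_trace), ann.exam_points)  # 4 [(123, 65), (128, 95)]
```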
The blood vessel position 27 and the examination position 29 depicted on the image 11 may be displayed in black and white, or in colors such as red, green, and blue. On the image 11, the blood vessel position 27 may be displayed in different colors depending on whether the blood vessel 26 is an artery or a vein.
The first input unit 20 inputs at least one of the blood vessel position 27 and the examination position 29 on the image 11. In this example, the first input unit 20 inputs at least one of the blood vessel position 27 and the examination position 29 on the image 11 through the electrical connection (see fig. 3) or the electromagnetic connection (see fig. 4) between the first input unit 20 and the image display unit 10.
In the case where the main body portion 90 is a tablet-type computer, the image display portion 10 may be a touch panel. In the case of a tablet-type computer as the main body portion 90, the blood vessel position 27 may be drawn on the touch panel so as to trace the blood vessel 26 displayed on the touch panel. An examination position 29 can be depicted on the touch panel at the position where the blood vessel 26 is examined. In the case where the main body portion 90 is not a tablet-type computer, at least one of the blood vessel position 27 and the examination position 29 may be drawn on the image 11 by a mouse, a pointing stick, or the like.
As described above, the image processing apparatus 100 may or may not include the image acquisition unit 30. When the image processing apparatus 100 does not include the image acquisition unit 30, the image 11 may be acquired by an image acquisition unit 30 outside the image processing apparatus 100. For example, when a patient has a digital camera and the image processing apparatus 100 is installed in a hospital, the image 11 acquired at the patient's home by the patient's digital camera may be stored in the storage unit 60 (described later) of the image processing apparatus 100. When a plurality of patients each have a digital camera and the image processing apparatus 100 is installed in a hospital, the images 11 acquired at the respective patients' homes by their digital cameras may be stored in the storage unit 60 (described later) of the image processing apparatus 100. The image 11 may be displayed on the image display unit 10 of the image processing apparatus 100. The image 11 may be transmitted from the patient's digital camera to the image processing apparatus 100 installed in the hospital through a communication line.
In the case where the image processing apparatus 100 does not include the image acquisition unit 30, the image 11 may be an image obtained by converting a drawing on paper into image data. An image 11 obtained by converting a drawing on paper into image data can be stored in a storage unit 60 (described later) of the image processing apparatus 100.
In the image processing apparatus 100 of this example, the first input unit 20 inputs at least one of the blood vessel position 27 and the examination position 29 on the image 11. Therefore, with the image processing apparatus 100 of this example, at least one of the blood vessel position 27 and the examination position 29 can be input more easily than when it is recorded as a drawing on paper, that is, as a drawing depicting the body part.
Fig. 7 is a diagram showing an example of the first image 11-1 of at least a part of the body 22 of the patient displayed on the image display unit 10. The first image 11-1 is the image 11 acquired by the image acquisition unit 30 at the first time t 1.
In this example, the blood vessel position 27 is the position information of the blood vessel 26 on the image 11-1, and is the position information of the blood vessel 26 at the first time t 1. In this example, the inspection position 29 is position information of the inspection position 28 on the image 11-1 and is position information of the inspection position 28 at the first time t 1. In this example, the first image 11-1 is an image 11 depicting at least one of the blood vessel position 27 and the examination position 29.
Fig. 8 is a diagram showing an example of the second image 11-2 of at least a part of the body 22 of the patient displayed on the image display unit 10. The second image 11-2 is the image 11 acquired by the image acquiring unit 30 at the second time t 2. In this example, the second time t2 is a time after a predetermined time has elapsed from the first time t 1.
The first time t1 and the second time t2 may be timings of checking whether or not a vascular stenosis is generated in the blood vessel 26. The blood vessel 26 may be examined by a nurse or the like at each of the first time t1 and the second time t 2. The predetermined time may be an interval from the inspection at the first time t1 to the next inspection at the second time t 2.
The position information of the blood vessel 26 at the second time t2 on the image 11-2 is set as the blood vessel position 37. The position information of the inspection position 28 at the second time t2 on the image 11-2 is set as the inspection position 39.
If a stenosis is generated in the blood vessel 26, it is difficult for blood to flow in the blood vessel 26. Therefore, the shape of the blood vessel 26 in the case where the blood vessel stenosis occurs may be changed from the shape of the blood vessel 26 before the blood vessel stenosis occurs. When the shape of the blood vessel 26 changes, the position of the blood vessel 26 in the body 22 of the patient may change.
In this example, the position of the blood vessel 26 in the body 22 of the patient at the second time t2 is changed compared to the position at the first time t 1. The location of the blood vessel 26 in the first image 11-1 is different from the location of the blood vessel 26 in the second image 11-2.
At a second time t2, the blood vessel position 37 may be depicted on the second image 11-2 in a manner tracing the blood vessel 26. At a second time t2, inspection location 39 may be depicted at a location on second image 11-2 that corresponds to inspection location 28 (see FIG. 5). In this example, the second image 11-2 is an image 11 depicting at least one of the blood vessel position 37 and the examination position 39.
At a second time t2, a plurality of positions along the blood vessel 26 may also be examined. In this example, two locations (examination location 28-1 and examination location 28-2) in blood vessel 26 are examined. Inspection position 39-1 and inspection position 39-2 correspond to inspection position 28-1 and inspection position 28-2, respectively.
Fig. 9 is a diagram showing another example of the block diagram of the image processing apparatus 100 according to the embodiment of the present invention. The image processing apparatus 100 of this example is different from the image processing apparatus 100 shown in fig. 2 in that it further includes a storage unit 60 and a determination unit 70. In this example, the main body 90 includes the storage unit 60 and the determination unit 70.
The storage unit 60 stores the position information input through the first input unit 20 in association with the image 11. In the example shown in fig. 6, the position information input through the first input unit 20 is at least one of the blood vessel position 27 and the examination position 29. The blood vessel position 27 is position information of the blood vessel 26 on the image 11. The examination position 29 is position information of the examination position 28 on the image 11.
The storage unit 60 may store the first image 11-1 (see fig. 7) at the first time t1 and the second image 11-2 (see fig. 8) at the second time t 2. The storage unit 60 may store the images 11 at three or more times when the blood vessel 26 is examined.
The image processing apparatus 100 of this example includes a second input unit 40 for inputting information of a patient to be examined. The user of the image processing apparatus 100 can input information of the patient as the examination target through the second input unit 40. The information on the patient to be examined may be information on the patient's age, sex, height, weight, previous disease, and the like. An identifier (for example, an ID number) for identifying a patient is set as an identifier M. The storage unit 60 may store the identifier M in association with the information of the patient. The storage unit 60 may store the identifier M in association with the information of the patient for a plurality of patients.
The storage unit 60 may store the identifier M in association with the image 11. The storage unit 60 may store the identifier M, the image 11, and at least one of the blood vessel position 27 and the examination position 29 in association with each other. In the Shunt failure scoring (STS), it is preferable to store at least one of the blood vessel position 27 and the inspection position 29, the identifier M, and the image 11 in association with each other. By storing at least one of the blood vessel position 27 and the examination position 29, the identifier M, and the image 11 in association with each other in advance, the image processing apparatus 100 can easily manage at least one of the blood vessel position 27 and the examination position 29, and the image 11 for each patient.
The storage unit 60 may store the identifier M in association with a plurality of images 11 (for example, the first image 11-1 and the second image 11-2). The storage unit 60 may store at least one of the blood vessel position 27 and the examination position 29, the identifier M, and the first image 11-1 in association with each other. The storage unit 60 may store at least one of the blood vessel position 37 and the examination position 39, the identifier M, and the second image 11-2 in association with each other.
The blood flow sound in the blood vessel 26 is referred to as blood flow sound S. The storage unit 60 may further store the blood flow sound S. The blood flow sound S is the sound of the blood flow output from the heart by the patient's heartbeat. The blood flow sound S may be a so-called shunt sound. In this example, the shunt sound is the sound of blood flowing in the shunt 24. The blood flow sound S may be a sound acquired at the examination position 28 (see fig. 5). The blood flow sound S can be acquired by the blood flow sound acquisition unit 50. The storage unit 60 may store the blood flow sound S in association with at least one of the blood vessel position 27 and the examination position 29.
The blood flow sound S at the first time t1 is referred to as a first blood flow sound S1. The blood flow sound S at the second time t2 is referred to as a second blood flow sound S2. The storage unit 60 may store the first blood flow sound S1 and the second blood flow sound S2. The storage unit 60 may store the blood flow sound S at each of three or more times at which the blood vessel 26 is examined.
The storage unit 60 may store the identifier M, the image 11, and the blood flow sound S in association with each other. The storage unit 60 may store the identifier M, the image 11, and at least one of the blood vessel position 27 and the examination position 29 in association with each other.
The storage unit 60 may store the identifier M, the plurality of images 11 (for example, the first image 11-1 and the second image 11-2), and the plurality of blood flow sounds S (for example, the first blood flow sound S1 and the second blood flow sound S2) in association with each other. The storage unit 60 may store the identifier M, the first image 11-1, and the first blood flow sound S1 in association with at least one of the blood vessel position 27 and the examination position 29. The storage unit 60 may store the identifier M, the second image 11-2, and the second blood flow sound S2 in association with at least one of the blood vessel position 37 and the examination position 39.
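To illustrate these associations, here is a minimal storage sketch in Python; the record layout, field names, and file paths are assumptions made for illustration and are not specified by the patent.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

Point = Tuple[int, int]

@dataclass
class ExaminationRecord:
    """One examination session (e.g., the first time t1 or the second time t2)."""
    timestamp: str                                # when the image and sound were acquired
    image_path: str                               # stored image 11 (first or second image)
    vessel_trace: List[Point]                     # blood vessel position 27 or 37
    exam_points: List[Point]                      # examination positions 29 or 39
    blood_flow_sound_path: Optional[str] = None   # stored waveform (S1 or S2), if any

# Records keyed by the patient identifier M, one entry per examination time.
storage: Dict[str, List[ExaminationRecord]] = {}

def store_record(identifier_m: str, record: ExaminationRecord) -> None:
    storage.setdefault(identifier_m, []).append(record)

store_record("M-0001", ExaminationRecord("t1", "img_t1.png",
             [(120, 40), (125, 80)], [(123, 65)], "sound_t1.wav"))
store_record("M-0001", ExaminationRecord("t2", "img_t2.png",
             [(135, 42), (140, 82)], [(138, 66)], "sound_t2.wav"))
print(len(storage["M-0001"]))  # 2 records associated with the same identifier M
```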
The image processing apparatus 100 may or may not include the blood flow sound acquisition unit 50. When the image processing apparatus 100 does not include the blood flow sound acquisition unit 50, the blood flow sound S may be acquired by a blood flow sound acquisition unit 50 outside the image processing apparatus 100. For example, when a patient has a stethoscope and the image processing apparatus 100 is installed in a hospital, the blood flow sound S acquired at the patient's home through the patient's stethoscope may be stored in the storage unit 60 of the image processing apparatus 100. The blood flow sound S may be transmitted from the patient's stethoscope to the image processing apparatus 100 installed in the hospital through a communication line.
The determination unit 70 may determine an abnormality of the blood vessel 26 based on the image 11. In this example, an abnormality of the blood vessel 26 refers to a state in which vascular stenosis has occurred to such an extent that, for example, the blood flow rate required for artificial dialysis can no longer be maintained.
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the first image 11-1 and the second image 11-2. As described above, in the case where the blood vessel 26 has generated a stenosis, the position of the blood vessel 26 in the first image 11-1 and the position of the blood vessel 26 in the second image 11-2 are sometimes different. Therefore, the determination unit 70 can determine the abnormality of the blood vessel 26 based on the position of the blood vessel 26 in the first image 11-1 and the position of the blood vessel 26 in the second image 11-2. The determination unit 70 may determine abnormality of the blood vessel 26 based on the images 11 at three or more times.
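The patent does not specify how the two drawn positions would be compared. The sketch below is one hypothetical approach: it flags an abnormality when the traced blood vessel position has shifted, on average, by more than a chosen pixel threshold between the first image 11-1 and the second image 11-2 (both the distance measure and the threshold value are assumptions).

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def mean_shift(trace_t1: List[Point], trace_t2: List[Point]) -> float:
    """Average distance from each point of the t1 trace to its nearest t2 point."""
    return sum(min(math.dist(p, q) for q in trace_t2) for p in trace_t1) / len(trace_t1)

def vessel_abnormal_by_position(trace_t1: List[Point], trace_t2: List[Point],
                                shift_threshold_px: float = 15.0) -> bool:
    # Flag an abnormality when the drawn vessel position has moved by more than
    # the (assumed) threshold between the two examination times.
    return mean_shift(trace_t1, trace_t2) >= shift_threshold_px

trace_t1 = [(120, 40), (122, 60), (125, 80), (130, 100)]
trace_t2 = [(138, 42), (141, 62), (145, 82), (150, 102)]
print(vessel_abnormal_by_position(trace_t1, trace_t2))  # True for this example shift
```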
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the blood flow sound S. The determination unit 70 may determine the abnormality of the blood vessel 26 based on the frequency characteristic of the blood flow sound S. The frequency characteristic of the blood flow sound S may be the frequency dependence of the amplitude of the waveform of the blood flow sound S. The determination unit 70 may determine that the blood vessel 26 is abnormal when the amplitude in a predetermined frequency band is smaller than a predetermined amplitude threshold tha. The amplitude threshold tha may be a predetermined ratio of the maximum value of the amplitude in the frequency dependence of the amplitude. The predetermined ratio may be, for example, 80% or 60%. The frequency characteristic of the blood flow sound S may instead be the frequency dependence of the amplitude of the envelope of the waveform of the blood flow sound S.
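A sketch of this kind of frequency check, assuming the blood flow sound S is available as a sampled waveform; the band limits, the 80% ratio, and the use of NumPy's FFT are illustrative choices, not requirements stated in the patent.

```python
import numpy as np

def abnormal_by_frequency(sound: np.ndarray, fs: float,
                          band_hz=(200.0, 800.0), ratio: float = 0.8) -> bool:
    """Return True when the largest amplitude inside the given band falls below
    the threshold tha = ratio * (overall maximum amplitude of the spectrum)."""
    spectrum = np.abs(np.fft.rfft(sound))
    freqs = np.fft.rfftfreq(len(sound), d=1.0 / fs)
    tha = ratio * spectrum.max()                          # amplitude threshold tha
    in_band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    return spectrum[in_band].max() < tha

# Synthetic example: a strong 100 Hz tone plus a weak 400 Hz component
fs = 4000.0
t = np.arange(0, 1.0, 1.0 / fs)
sound = np.sin(2 * np.pi * 100 * t) + 0.1 * np.sin(2 * np.pi * 400 * t)
print(abnormal_by_frequency(sound, fs))  # True: the 200-800 Hz band is comparatively weak
```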
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the amplitude characteristic of the blood flow sound S. The amplitude characteristic of the blood flow sound S may be the amplitude characteristic of the waveform of the blood flow sound S or the amplitude characteristic of the envelope of the waveform of the blood flow sound S. The determination unit 70 may determine that the blood vessel 26 is abnormal when the minimum value of the amplitude of the envelope is smaller than a predetermined amplitude threshold thb. The amplitude threshold thb may be a predetermined ratio of the maximum value of the amplitude of the envelope. The predetermined ratio may be, for example, 80% or 60%.
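A corresponding sketch of the envelope check; here the envelope is taken as the magnitude of the analytic signal (Hilbert transform), which is a common choice but is not mandated by the patent, and the 80% ratio is again only an example.

```python
import numpy as np
from scipy.signal import hilbert  # analytic-signal envelope

def abnormal_by_envelope(sound: np.ndarray, ratio: float = 0.8) -> bool:
    """Return True when the minimum of the envelope falls below the threshold
    thb = ratio * (maximum of the envelope)."""
    envelope = np.abs(hilbert(sound))
    thb = ratio * envelope.max()          # amplitude threshold thb
    return envelope.min() < thb

# Synthetic example: an amplitude-modulated tone whose envelope dips strongly
fs = 4000.0
t = np.arange(0, 1.0, 1.0 / fs)
sound = (0.5 + 0.5 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 200 * t)
print(abnormal_by_envelope(sound))  # True: the envelope minimum is close to zero
```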
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the first blood flow sound S1 and the second blood flow sound S2. When stenosis develops in the blood vessel 26, blood flows less easily through it. Therefore, at least one of the frequency characteristic and the amplitude characteristic of the blood flow sound S after the stenosis has occurred may differ from that before the stenosis occurred. The determination unit 70 can therefore determine the abnormality of the blood vessel 26 based on the first blood flow sound S1 and the second blood flow sound S2. The determination unit 70 may also determine the abnormality of the blood vessel 26 based on the blood flow sounds S at three or more times.
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the image 11 and the blood flow sound S. The determination unit 70 may determine that the blood vessel 26 is abnormal when the blood vessel 26 is determined to be abnormal based on the image 11 alone and is also determined to be abnormal based on the blood flow sound S alone. By determining the abnormality of the blood vessel 26 in this manner, the determination unit 70 can determine the abnormality of the blood vessel 26 more accurately than when the determination is based on only one of the image 11 and the blood flow sound S.
The determination unit 70 may instead determine that the blood vessel 26 is abnormal when the blood vessel 26 is determined to be abnormal based on the image 11 alone or is determined to be abnormal based on the blood flow sound S alone. By determining the abnormality of the blood vessel 26 in this manner, the determination unit 70 can detect possible abnormalities of the blood vessel 26 more broadly than when the determination is based on only one of the image 11 and the blood flow sound S.
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the first image 11-1 and the second image 11-2 together with the first blood flow sound S1 and the second blood flow sound S2. The determination unit 70 may determine that the blood vessel 26 is abnormal when the blood vessel 26 is determined to be abnormal based on the positions of the blood vessel 26 in the first image 11-1 and in the second image 11-2 and is also determined to be abnormal based on at least one of the frequency characteristic and the amplitude characteristic of each of the first blood flow sound S1 and the second blood flow sound S2. Alternatively, the determination unit 70 may determine that the blood vessel 26 is abnormal when either one of these two determinations indicates an abnormality. The determination unit 70 may also determine the abnormality of the blood vessel 26 based on the images 11 and the blood flow sounds S at three or more times.
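The AND/OR combination above can be pictured with a small helper (the function and argument names are made up for illustration).

```python
def combined_decision(abnormal_by_image: bool, abnormal_by_sound: bool,
                      require_both: bool = True) -> bool:
    """require_both=True: flag an abnormality only when both the image-based and the
    sound-based determinations indicate one (fewer false alarms).
    require_both=False: flag when either determination indicates one
    (catches possible abnormalities more broadly)."""
    if require_both:
        return abnormal_by_image and abnormal_by_sound
    return abnormal_by_image or abnormal_by_sound

print(combined_decision(True, False, require_both=True))   # False
print(combined_decision(True, False, require_both=False))  # True
```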
The image acquiring unit 30 may continuously acquire the images 11. The image acquiring unit 30 may acquire the images 11 continuously from the first time t1 to the second time t 2. When the image acquisition unit 30 continuously acquires the image 11, the image 11 is a moving image. When the image acquisition unit 30 continuously acquires the images 11, the position of the blood vessel 26 on the images 11 can be continuously changed.
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the position of the blood vessel 26 that continuously changes on the image 11. The determination unit 70 may determine the abnormality of the blood vessel 26 based on the amount of change per unit time in the position of the blood vessel 26. The determination unit 70 may determine that the blood vessel 26 is abnormal when the amount of change per unit time in the position of the blood vessel 26 is equal to or greater than a predetermined amount of change.
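For the moving-image case, one hypothetical way to check the amount of change per unit time is sketched below; the frame interval, the per-second threshold, and the mean-shift distance measure are all assumptions.

```python
import math
from typing import List, Sequence, Tuple

Point = Tuple[float, float]

def mean_shift(a: List[Point], b: List[Point]) -> float:
    # Average distance from each point of trace a to its nearest point in trace b.
    return sum(min(math.dist(p, q) for q in b) for p in a) / len(a)

def abnormal_by_position_rate(traces_per_frame: Sequence[List[Point]],
                              frame_interval_s: float,
                              rate_threshold_px_per_s: float = 5.0) -> bool:
    """Flag an abnormality when the vessel position drawn on consecutive frames of the
    moving image changes faster than the (assumed) threshold in pixels per second."""
    for prev, curr in zip(traces_per_frame, traces_per_frame[1:]):
        if mean_shift(prev, curr) / frame_interval_s >= rate_threshold_px_per_s:
            return True
    return False

frames = [[(120.0, 40.0), (125.0, 80.0)],
          [(121.0, 40.5), (126.0, 80.5)],
          [(130.0, 45.0), (135.0, 85.0)]]   # sudden jump in the last frame
print(abnormal_by_position_rate(frames, frame_interval_s=1.0))  # True
```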
The blood flow sound acquisition unit 50 may continuously acquire the blood flow sound S. The blood flow sound acquisition unit 50 may acquire the blood flow sound S continuously from the first time t1 to the second time t2. When the blood flow sound acquisition unit 50 acquires the blood flow sound S continuously, at least one of the amplitude characteristic and the frequency characteristic of the blood flow sound S can change continuously.
The determination unit 70 may determine the abnormality of the blood vessel 26 based on the continuously changing blood flow sound S. The determination unit 70 may determine the abnormality of the blood vessel 26 based on at least one of the amplitude characteristic and the frequency characteristic of the sound wave of the blood flow sound S that continuously changes. The determination unit 70 may determine the abnormality of the blood vessel 26 based on at least one of the amount of change in amplitude and the amount of change in frequency per unit time in the continuously changing blood flow sound S.
The determination unit 70 may determine that the blood vessel 26 is abnormal when the amount of change in the amplitude per unit time in the blood flow sound S is equal to or greater than a predetermined amount of change. The determination unit 70 may determine that the blood vessel 26 is abnormal when the amount of change in the frequency per unit time in the blood flow sound S is equal to or greater than a predetermined amount of change.
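A sketch of the rate-of-change check for the continuously acquired sound: the waveform is split into short windows, and the change in peak amplitude and dominant frequency between consecutive windows is compared with thresholds. The window length and both thresholds are placeholder values, not taken from the patent.

```python
import numpy as np

def abnormal_by_sound_rate(sound: np.ndarray, fs: float, window_s: float = 1.0,
                           amp_change_thr: float = 0.5,
                           freq_change_thr_hz: float = 100.0) -> bool:
    """Flag an abnormality when the peak amplitude or the dominant frequency changes
    between consecutive windows by at least the given (assumed) thresholds."""
    n = int(window_s * fs)
    amps, freqs = [], []
    for start in range(0, len(sound) - n + 1, n):
        seg = sound[start:start + n]
        amps.append(np.abs(seg).max())
        spectrum = np.abs(np.fft.rfft(seg))
        freqs.append(np.fft.rfftfreq(n, d=1.0 / fs)[spectrum.argmax()])
    for i in range(1, len(amps)):
        if (abs(amps[i] - amps[i - 1]) >= amp_change_thr or
                abs(freqs[i] - freqs[i - 1]) >= freq_change_thr_hz):
            return True
    return False

# Synthetic example: the tone drops in amplitude and shifts in frequency after 2 s
fs = 2000.0
t = np.arange(0, 2.0, 1.0 / fs)
sound = np.concatenate([np.sin(2 * np.pi * 300 * t),
                        0.3 * np.sin(2 * np.pi * 500 * t)])
print(abnormal_by_sound_rate(sound, fs))  # True
```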
Fig. 10 is a diagram illustrating an example of an image processing method according to an embodiment of the present invention. Fig. 10 shows an example of an image processing method in the image processing apparatus 100. Stage S100 is a stage of inputting information of the patient. The stage S100 may be a stage in which the information of the patient is input through the second input unit 40 (see fig. 1, 2, and 9). The information of the patient can be the age, sex, height, weight, past diseases and other information of the patient.
The stage S102 is a stage of starting the image acquisition unit 30. The stage S102 may be a stage in which the control unit 80 (see fig. 1, 2, and 9) activates the image acquisition unit 30.
Stage S104 is a stage in which the image acquisition unit 30 acquires the image 11. The image acquisition unit 30 may acquire the image 11 when the information of the patient is input through the second input unit 40. If the information of the patient is input in stage S100, the control unit 80 may activate the image acquisition unit 30 in stage S102. If the information of the patient is not input in stage S100, the image acquisition unit 30 may not acquire the image 11, and the control unit 80 may not activate the image acquisition unit 30 in stage S102.
Stage S106 is a stage of storing the image 11 in the storage unit 60. Stage S106 may be a stage in which the control unit 80, upon acquisition of the image 11 by the image acquisition unit 30, instructs the storage unit 60 to store the image 11. In stage S106, the storage unit 60 may store a plurality of images 11 (e.g., the first image 11-1 and the second image 11-2).
After stage S106, the image processing method of the present example proceeds to a stage of inputting at least one of the blood vessel position 27 and the examination position 29. In this example, a case where both the blood vessel position 27 and the examination position 29 are input will be described. In the image processing method of this example, either the blood vessel position 27 or the examination position 29 may be input. In the case where the blood vessel position 27 is input first, the image processing method of the present example proceeds to stage S108 after stage S106. In the case where the examination position 29 is input first, the image processing method of this example proceeds to stage S112 after stage S106.
Stage S108 is a stage of inputting the blood vessel position 27. Stage S108 may be a stage in which the blood vessel position 27 is input through the first input unit 20. Stage S110 is a stage of inputting the examination position 29. Stage S110 may be a stage in which the examination position 29 is input through the first input unit 20.
Stage S112 is a stage of inputting the examination position 29. Stage S112 may be a stage in which the examination position 29 is input through the first input unit 20. Stage S114 is a stage of inputting the blood vessel position 27. Stage S114 may be a stage in which the blood vessel position 27 is input through the first input unit 20.
The image processing method of the present example proceeds to stage S116 after the stage of inputting at least one of the blood vessel position 27 and the examination position 29. Stage S116 is a stage of determining abnormality of the blood vessel 26. Stage S116 may be a stage in which the determination unit 70 determines an abnormality of the blood vessel 26.
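The staged flow of Fig. 10 can be summarized as a small driver function; each callable below is a placeholder standing in for the corresponding unit, and the order of the two input stages (S108/S110 versus S112/S114) is chosen arbitrarily here, as either order is allowed.

```python
def run_examination(input_patient_info, acquire_image, store_image,
                    input_vessel_position, input_exam_position, judge_abnormality):
    """Hypothetical driver following stages S100-S116 of Fig. 10."""
    patient = input_patient_info()             # S100: second input unit 40
    if patient is None:
        return None                            # no patient info: the camera is not started
    image = acquire_image()                    # S102/S104: activate camera, acquire image 11
    store_image(patient, image)                # S106: storage unit 60
    vessel_pos = input_vessel_position(image)  # S108 (or S114): first input unit 20
    exam_pos = input_exam_position(image)      # S110 (or S112): first input unit 20
    return judge_abnormality(image, vessel_pos, exam_pos)  # S116: determination unit 70

result = run_examination(
    input_patient_info=lambda: {"id": "M-0001"},
    acquire_image=lambda: "img_t1.png",
    store_image=lambda patient, image: None,
    input_vessel_position=lambda image: [(120, 40), (125, 80)],
    input_exam_position=lambda image: [(123, 65)],
    judge_abnormality=lambda image, v, e: False)
print(result)  # False
```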
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams. In the various embodiments of the present invention, a block may represent (1) a stage of a process in which an operation is performed or (2) a section of an apparatus that is responsible for performing the operation.
Specific stages may be performed by dedicated circuitry, programmable circuitry, or a processor. Specific sections may be implemented by dedicated circuitry, programmable circuitry, or a processor. The programmable circuitry and the processor may be supplied with computer readable instructions. The computer readable instructions may be stored on a computer readable medium.
The dedicated circuitry may include at least one of a digital hardware circuit and an analog hardware circuit. The dedicated circuitry may include at least one of an integrated circuit (IC) and a discrete circuit. The programmable circuitry may include hardware circuits for logical AND, logical OR, logical XOR, logical NAND, logical NOR, or other logical operations. The programmable circuitry may include reconfigurable hardware circuits including memory elements such as flip-flops, registers, field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), and the like.
A computer-readable medium may comprise any tangible device capable of holding instructions for execution by a suitable device. As a result, the computer-readable medium having the instructions stored therein constitutes a product including instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
The computer readable medium may be, for example, an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, or the like. The computer-readable medium may be, more specifically, for example, a floppy disk (registered trademark), a magnetic disk, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an erasable programmable read only memory (EPROM or flash memory), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Static Random Access Memory (SRAM), a compact disc read only memory (CD-ROM), a Digital Versatile Disk (DVD), a blu-ray disk (RTM), a memory stick, an integrated circuit card, or the like.
Computer-readable instructions may include any of assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, source code, and object code. The source code and the object code may be written in any combination of one or more programming languages, including object-oriented programming languages and conventional procedural programming languages. The object-oriented programming language may be, for example, Smalltalk, JAVA (registered trademark), C++, or the like. The procedural programming language may be, for example, the "C" programming language.
The computer readable instructions may be provided to a processor or programmable circuitry of a general purpose computer, special purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like. The processor or programmable circuitry of the general purpose computer, special purpose computer, or other programmable data processing apparatus may execute the computer readable instructions to create means for performing the operations specified by the flowchart shown in fig. 10 or the block diagrams shown in fig. 2 and 9. The processor may be, for example, a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, or the like.
Fig. 11 is a diagram showing an example of a computer 2200 that can embody the whole or a part of the image processing apparatus 100 of the present invention. A program installed in the computer 2200 can cause the computer 2200 to function as the image processing apparatus 100 according to the embodiment of the present invention or as one or more sections of the image processing apparatus 100, can cause the computer 2200 to execute operations associated with the apparatus or those sections, or can cause the computer 2200 to execute each stage of the image processing method of the present invention (see fig. 10). Such a program can be executed by the CPU 2212 to cause the computer 2200 to perform specific operations associated with some or all of the blocks in the flowchart (fig. 10) and the block diagrams (fig. 2 and 9) described in this specification.
The computer 2200 of this embodiment includes a CPU 2212, a RAM 2214, a graphics controller 2216, and a display device 2218. The CPU 2212, the RAM 2214, the graphics controller 2216, and the display device 2218 are connected to each other through a main controller 2210. The computer 2200 also includes input/output units such as a communication interface 2222, a hard disk drive 2224, a DVD-ROM drive 2226, and an IC card drive. The communication interface 2222, the hard disk drive 2224, the DVD-ROM drive 2226, the IC card drive, and the like are connected to the main controller 2210 via an input/output controller 2220. The computer 2200 also includes conventional input/output units such as a ROM 2230 and a keyboard 2242. The ROM 2230, the keyboard 2242, and the like are connected to the input/output controller 2220 via an input/output chip 2240.
The CPU 2212 operates in accordance with programs stored in the ROM 2230 and the RAM 2214, thereby controlling each unit. The graphics controller 2216 obtains image data generated by the CPU 2212 in a frame buffer or the like provided in the RAM 2214, and displays the image data on the display device 2218.
The ROM 2230 stores a boot program or the like executed by the computer 2200 when activated, or a program related to hardware of the computer 2200. The input/output chip 2240 may connect various input/output units to the input/output controller 2220 via a parallel port, a serial port, a keyboard port, a mouse port, and the like.
The program is provided through a computer-readable medium such as the DVD-ROM 2201 or an IC card. The program is read from the computer-readable medium, installed to the hard disk drive 2224, the RAM 2214, or the ROM 2230, which are also examples of a computer-readable medium, and executed by the CPU 2212. The information processing described in these programs is read by the computer 2200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 2200.
For example, in the case where communication is performed between the computer 2200 and an external device, the CPU 2212 may execute a communication program loaded into the RAM 2214 and instruct the communication interface 2222 to perform communication processing based on the processing described in the communication program. Under the control of the CPU 2212, the communication interface 2222 reads transmission data held in a transmission buffer area provided in a recording medium such as the RAM 2214, the hard disk drive 2224, the DVD-ROM 2201, or the IC card, and transmits the read transmission data to the network, or writes reception data received from the network into a reception buffer area or the like provided on the recording medium.
The CPU 2212 can read all or necessary parts of files or databases stored in an external recording medium such as the hard disk drive 2224, the DVD-ROM drive 2226 (DVD-ROM 2201), or an IC card into the RAM 2214. The CPU 2212 can perform various types of processing on the data in the RAM 2214. The CPU 2212 may then write the processed data back to the external recording medium.
Various types of information such as programs, data, tables, and databases may be stored in a recording medium and subjected to information processing. The CPU 2212 can perform, on data read from the RAM 2214, the various types of processing described in the present disclosure, including various operations specified by the instruction sequence of a program, information processing, condition judgment, conditional branching, unconditional branching, retrieval or replacement of information, and the like, and can write the results back to the RAM 2214.
The CPU 2212 can retrieve information in a file, a database, or the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 2212 may retrieve, from among the plurality of entries, an entry whose attribute value of the first attribute matches a specified condition, and read the attribute value of the second attribute stored in that entry, thereby acquiring the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
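For illustration only, the attribute lookup described above amounts to selecting the entry whose first attribute satisfies a condition and returning the associated second attribute. The field names patient_id and image_path in the following Python sketch are hypothetical and do not come from the disclosure.

```python
# Illustrative sketch of retrieving the second attribute associated with a
# first attribute that satisfies a condition; field names are hypothetical.
from typing import Any, Iterable, Optional

def lookup_second_attribute(entries: Iterable[dict], first_attr: str,
                            second_attr: str, wanted: Any) -> Optional[Any]:
    for entry in entries:
        if entry.get(first_attr) == wanted:   # entry matching the specified condition
            return entry.get(second_attr)     # attribute value of the second attribute
    return None

# Example: acquire the image path associated with a given patient ID.
records = [{"patient_id": "P001", "image_path": "img_001.png"},
           {"patient_id": "P002", "image_path": "img_002.png"}]
print(lookup_second_attribute(records, "patient_id", "image_path", "P002"))  # img_002.png
```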
The program or software module described above may be stored on the computer 2200 or in a computer-readable medium accessible from the computer 2200. A recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet can be used as the computer-readable medium, and the program can thereby be provided to the computer 2200 through such a recording medium.
The present invention has been described above using the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes or improvements can be made to the above embodiments. It is clear from the claims that embodiments incorporating such changes or improvements can also be included in the technical scope of the present invention.
It should be noted that the execution order of the processes such as the operations, procedures, steps, and stages in the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings can be realized in any order unless the order is explicitly indicated by terms such as "before" or "in advance", and unless the output of a preceding process is used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using terms such as "first" and "next" for convenience, this does not necessarily mean that the operations must be performed in this order.
Description of the reference numerals
10: an image display unit; 11: an image; 12: a front side; 14: a back side; 15: a protection part; 16: an electrostatic detection section; 17: a liquid crystal section; 18: a magnetic field detection unit; 20: a first input section; 22: a body; 24: a shunt tube; 26: a blood vessel; 27: a position of the blood vessel; 28: an inspection position; 29: an inspection position; 30: an image acquisition unit; 37: a position of the blood vessel; 39: an inspection position; 40: a second input section; 50: a blood flow sound acquisition unit; 60: a storage unit; 70: a determination unit; 80: a control unit; 90: a main body portion; 100: an image processing apparatus; 2200: a computer; 2201: a DVD-ROM; 2210: a main controller; 2212: a CPU; 2214: a RAM; 2216: a graphics controller; 2218: a display device; 2220: an input-output controller; 2222: a communication interface; 2224: a hard disk drive; 2226: a DVD-ROM drive; 2230: a ROM; 2240: an input-output chip; 2242: a keyboard.
Claims (11)
1. An image processing apparatus is characterized by comprising:
an image display unit that displays an image of at least a part of the body of the patient including the position where the shunt tube is provided; and
a first input unit configured to input, on the image displayed by the image display unit, position information on the image of at least one of a position of a blood vessel in the body of the patient and an examination position at which the blood vessel is examined.
2. The image processing apparatus according to claim 1,
the first input unit inputs the position information of at least one of the position of the blood vessel and the inspection position on the image by an electrical or electromagnetic connection between the first input unit and the image display unit.
3. The image processing apparatus according to claim 1 or 2,
further comprising an image acquisition unit that acquires the image.
4. The image processing apparatus according to claim 3,
further comprising a second input unit for inputting information of the patient,
wherein the image acquisition unit acquires the image when the information of the patient is input through the second input unit.
5. The image processing apparatus according to any one of claims 1 to 4,
the image processing apparatus further includes a storage unit that stores the position information input by the first input unit in association with the image.
6. The image processing apparatus according to claim 5,
the image processing apparatus further includes a determination unit configured to determine an abnormality of the blood vessel based on the image.
7. The image processing apparatus according to claim 6,
the storage unit stores a first image depicting at least one of the position of the blood vessel and the inspection position at a first time, and a second image depicting at least one of the position of the blood vessel and the inspection position at a second time after a predetermined time has elapsed from the first time,
the determination unit determines an abnormality of the blood vessel based on the first image and the second image.
8. The image processing apparatus according to claim 6 or 7,
the storage unit further stores a blood flow sound in the blood vessel,
the determination unit determines an abnormality of the blood vessel based on the image and the blood flow sound.
9. The image processing apparatus according to claim 8,
the storage unit further stores a first blood flow sound, which is the blood flow sound at a first time, and a second blood flow sound, which is the blood flow sound at a second time after a predetermined time has elapsed from the first time,
the determination unit determines an abnormality of the blood vessel based on the first blood flow sound and the second blood flow sound.
10. The image processing apparatus according to claim 8 or 9,
further comprising a blood flow sound acquisition unit that acquires the blood flow sound.
11. An image processing program, characterized in that
the image processing program causes a computer to function as the image processing apparatus according to any one of claims 1 to 10.
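Claims 7 and 9 leave open how the determination unit compares the first and second images and the first and second blood flow sounds. The following Python sketch shows one possible, purely illustrative determination rule; the displacement metric, the sound-energy comparison, and the thresholds are assumptions for illustration and are not specified by the claims.

```python
# Hedged sketch of one possible determination rule (not specified by the claims):
# flag a suspected abnormality when the marked positions or the blood flow sound
# change beyond illustrative thresholds between the first and second times.
import numpy as np

def vessel_abnormality_suspected(pos_t1: np.ndarray, pos_t2: np.ndarray,
                                 sound_t1: np.ndarray, sound_t2: np.ndarray,
                                 pos_tol: float = 10.0, sound_tol: float = 0.3) -> bool:
    # Mean displacement of the marked blood vessel / inspection positions, arrays of shape (N, 2).
    pos_shift = float(np.mean(np.linalg.norm(pos_t2 - pos_t1, axis=1)))
    # Relative change in blood flow sound energy between the two recordings.
    e1, e2 = float(np.sum(sound_t1 ** 2)), float(np.sum(sound_t2 ** 2))
    sound_change = abs(e2 - e1) / max(e1, 1e-12)
    return pos_shift > pos_tol or sound_change > sound_tol
```

In practice the actual criterion would be whatever the description of the embodiment defines (for example, a change suggestive of stenosis); the numbers above are placeholders.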
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-191167 | 2020-11-17 | ||
JP2020191167A JP2022080154A (en) | 2020-11-17 | 2020-11-17 | Image processing device and image processing program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114511490A true CN114511490A (en) | 2022-05-17 |
Family
ID=81548524
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111356494.3A Pending CN114511490A (en) | 2020-11-17 | 2021-11-16 | Image processing apparatus and image processing program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2022080154A (en) |
CN (1) | CN114511490A (en) |
- 2020-11-17: JP application JP2020191167A filed (published as JP2022080154A), status pending
- 2021-11-16: CN application CN202111356494.3A filed (published as CN114511490A), status pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5690115A (en) * | 1995-09-21 | 1997-11-25 | Feldman; Charles L. | Detecting vascular stenosis in chronic hemodialysis patients |
JPH1133110A (en) * | 1997-07-17 | 1999-02-09 | Nikon Corp | Kidney dialysis management system and computer-readable recording medium with recorded kidney dialysis management program |
JP2009254678A (en) * | 2008-04-18 | 2009-11-05 | Chuo Electronics Co Ltd | Shunt status detector |
JP2014008263A (en) * | 2012-06-29 | 2014-01-20 | Univ Of Yamanashi | Shunt constriction diagnostic support system and method, array shaped sound collection sensor device, and successive segmentation self organized map forming device, method and program |
US20140088427A1 (en) * | 2012-09-25 | 2014-03-27 | Fuji Film Corporation | Ultrasound diagnostic apparatus |
US20170196478A1 (en) * | 2014-06-25 | 2017-07-13 | Canary Medical Inc. | Devices, systems and methods for using and monitoring tubes in body passageways |
JP2016087307A (en) * | 2014-11-10 | 2016-05-23 | 国立大学法人群馬大学 | Imaging support device and dialysis apparatus |
US20160314601A1 (en) * | 2015-04-21 | 2016-10-27 | Heartflow, Inc. | Systems and methods for risk assessment and treatment planning of arterio-venous malformation |
JP2017060624A (en) * | 2015-09-25 | 2017-03-30 | テルモ株式会社 | Shunt blood vessel detection device |
JP2018202042A (en) * | 2017-06-08 | 2018-12-27 | 株式会社テクノサイエンス | Puncture system, puncture control device, and puncture needle |
CN111656402A (en) * | 2017-12-20 | 2020-09-11 | 皇家飞利浦有限公司 | Device, system and method for interacting with blood vessel images |
Non-Patent Citations (2)
Title |
---|
殷慧康; 夏云宝; 吴琼: "Application value of brachiocephalic vein CTV in preoperative evaluation of arteriovenous fistula" (头臂静脉CTV在动静脉内瘘术前评估中的应用价值), 医疗卫生装备, no. 12, 15 December 2016 (2016-12-15) * |
马丽萍, 周启昌: "Non-invasive assessment of vascular endothelial function by high-resolution ultrasound" (高分辨率超声无创性检测血管内皮功能), 中国超声医学杂志, no. 09, 5 September 1999 (1999-09-05) * |
Also Published As
Publication number | Publication date |
---|---|
JP2022080154A (en) | 2022-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9144407B2 (en) | Image processing device and method, and program | |
US8099304B2 (en) | System and user interface for processing patient medical data | |
RU2008106247A (en) | AUTOMATED SYSTEM OF COLLECTION AND ARCHIVING OF INFORMATION FOR VERIFICATION OF MEDICAL NEED FOR PERFORMANCE OF MEDICAL PROCEDURE | |
US20110144918A1 (en) | Biological information display device, biological information display system, statistical processing method, and recording medium recording thereon statistical processing program | |
US7418120B2 (en) | Method and system for structuring dynamic data | |
CN113902642B (en) | Medical image processing method and device, electronic equipment and storage medium | |
US20200234828A1 (en) | System, apparatus, method, and graphical user interface for screening | |
CN1983258A (en) | System and user interface for processing patient medical data | |
AU2024205829A1 (en) | Medical care assistance device, and operation method and operation program therefor | |
US20170243348A1 (en) | Assistance apparatus for assisting interpretation report creation and method for controlling the same | |
CN114511490A (en) | Image processing apparatus and image processing program | |
CN109599189B (en) | Method, device, electronic equipment and storage medium for monitoring abnormal medication response | |
US20230230246A1 (en) | Information processing system, information processing method, and information processing program | |
WO2018073707A1 (en) | System and method for workflow-sensitive structured finding object (sfo) recommendation for clinical care continuum | |
CN114504338A (en) | Heartbeat information acquisition device and heartbeat information acquisition program | |
CN111192681A (en) | Method and system for acquiring target blood glucose characteristics | |
JP2020057035A (en) | Supporting device, display system, and method for support | |
CN113610841B (en) | Blood vessel abnormal image identification method and device, electronic equipment and storage medium | |
CN114680932A (en) | Display method for medical device and medical device | |
US20160034646A1 (en) | Systems and methods for electronic medical charting | |
US20030204411A1 (en) | Medical security system | |
JP2022099055A (en) | Medical information display device and medical information display system | |
WO2019208654A1 (en) | Endoscopic examination information analysis device, endoscopic examination information input supporting system, and endoscopic examination information analysis method | |
US20160292360A1 (en) | Method and system for patient identification when obtaining medical images | |
EP2716225A1 (en) | Image processing apparatus and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||