WO2017222283A1 - Method for analyzing health condition on basis of image photographing and providing health condition information - Google Patents

Info

Publication number: WO2017222283A1
Authority: WO (WIPO, PCT)
Prior art keywords: image, method, bone, fat, calculating
Application number: PCT/KR2017/006478
Other languages: French (fr), Korean (ko)
Inventor: 원영준
Original Assignee: 가톨릭관동대학교 산학협력단
Priority date: 2016-06-21 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2017-06-20
Publication date: 2017-12-28
Priority claimed from KR20160077561 (KR10-2016-0077561)
Application filed by 가톨릭관동대학교 산학협력단
Publication of WO2017222283A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radiowaves
    • A61B 5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radiowaves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B 6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/02 Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computerised tomographs

Abstract

The present invention relates to a method for analyzing a health condition and providing health condition information and, particularly, to a method for analyzing a health condition on the basis of an image taken by a computed tomography (CT) or magnetic resonance imaging (MRI) apparatus and providing health condition information, the method comprising: receiving an image taken of a body; extracting an image of a specific body part from the taken image; calculating bone, fat, and muscle percentages from the extracted image; and extracting and displaying, from a diagnostic table in which health condition information according to the percentages is stored, the health condition information for the calculated percentages. The method for analyzing a health condition and providing health condition information of the present invention as described above allows the health condition to be diagnosed on the basis of the bone, fat, and muscle percentages actually distributed in the body, thereby improving diagnostic accuracy compared with the conventional art.

Description

Method for analyzing health status based on imaging and providing health status information

The present invention relates to a method for analyzing health status and providing health status information and, more particularly, to a method for analyzing health status and providing health status information based on an image photographed by a computed tomography (CT) or magnetic resonance imaging (MRI) apparatus.

Conventionally, health conditions have been diagnosed by measuring the ratio of fat to muscle, and related diagnostic techniques include bioelectrical impedance analysis and ultrasound diagnosis. Bioelectrical impedance analysis estimates the ratio of fat and muscle from an electrical resistance value, so its diagnostic accuracy degrades when a confounding element such as metal is attached to the user's body. Ultrasound diagnosis forms an image from ultrasonic waves that are transmitted into the body and reflected back, but the sites that can be diagnosed are limited because ultrasonic waves propagate poorly through organs that contain air.

In diagnosing health status, the proportion of bone is an important diagnostic factor in addition to the proportions of fat and muscle. However, neither bioelectrical impedance analysis nor ultrasound diagnosis can measure the area or volume of bone, so the bone ratio cannot be included as a diagnostic element.

Dual-energy X-ray absorptiometry can measure bone, fat, and muscle, but its results correlate poorly with central bone density, and it cannot measure bone mineral density, fat, and muscle mass over the entire body.

Therefore, there is a need for a method of analyzing health status and providing health status information that diagnoses the health condition from the relationship among bone, fat, and muscle and thereby improves diagnostic accuracy.

An object of the present invention is to meet the needs described above by providing a method for analyzing health status and providing health status information that improves diagnostic accuracy by diagnosing the health condition in terms of bone, fat, and muscle.

According to an exemplary embodiment of the present invention, a method for analyzing health status based on image capture and providing health status information includes: receiving a captured image of a body; extracting an image of a specific body part from the captured image; calculating the ratios of bone, fat, and muscle in the extracted image; and extracting and displaying, from a diagnosis table in which health status information according to the ratios is stored, the health status information corresponding to the calculated ratios.

In the present invention, receiving the captured image may be a step of receiving an image taken by computed tomography (CT), magnetic resonance imaging (MRI), or X-ray.

In the present invention, extracting the image may include: matching the captured image to a specific reference point of a pre-stored specific body-part image; resizing the captured image so that the body outline shown in the captured image matches the body outline shown in the specific body-part image; and extracting, from the captured image, the image of the region overlapping the specific body-part image.

In the present invention, the pre-stored specific body-part image may be a standardized rectangular image that defines the uppermost end of the specific body part as an upper limit and the lowermost end as a lower limit and connects the upper limit and the lower limit.

In the present invention, after extracting the image, the method may further include designating an analysis target region in the extracted image.

In the present invention, receiving the captured image may be a step of receiving an image taken from the front of the user's body.

In the present invention, calculating the ratios of bone, fat, and muscle in the extracted image may include: calculating areas by detecting the regions whose attenuation values correspond to bone, fat, and muscle; and calculating the ratio of each area.

In the present invention, receiving the captured image may be a step of receiving images taken from the front, side, and top of the user's body.

In the present invention, calculating the ratios of bone, fat, and muscle in the extracted image may include: calculating volumes by detecting the regions whose attenuation values correspond to bone, fat, and muscle; and calculating the ratio of each volume.

In the present invention, extracting and displaying the health status information may be a step of displaying the health status information converted into a score.

As described above, the health status analysis and health status information providing method of the present invention makes it possible to diagnose the health condition from the proportions of bone, fat, and muscle actually distributed in the body, which improves diagnostic accuracy compared with conventional techniques.

FIG. 1 is a flowchart illustrating a health status analysis and health status information providing method according to an embodiment of the present invention.

FIG. 2 is a flowchart illustrating the step of extracting an image of a specific body part in the health status analysis and health status information providing method according to an exemplary embodiment of the present invention.

FIG. 3 is a flowchart illustrating the step of calculating the ratios of bone, fat, and muscle in the health status analysis and health status information providing method according to an exemplary embodiment of the present invention.

FIG. 4 is a flowchart illustrating the step of calculating the ratios of bone, fat, and muscle in a health status analysis and health status information providing method according to another exemplary embodiment of the present invention.

FIG. 5 is an exemplary view of a screen implementing the step of extracting an image of a specific body part from a captured image according to an exemplary embodiment of the present invention.

FIG. 6 is an exemplary view of a screen implementing the step of calculating the area ratios of bone, fat, and muscle according to an embodiment of the present invention.

FIG. 7 is an exemplary view of a screen implementing the step of calculating the volume ratios of bone, fat, and muscle according to another embodiment of the present invention.

The health status analysis and health status information providing method may include receiving a captured image of a body, extracting an image of a specific body part from the captured image, calculating the ratios of bone, fat, and muscle in the extracted image, and extracting and displaying, from a diagnosis table in which health status information according to the ratios is stored, the health status information corresponding to the calculated ratios.

DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily implement the present invention. As those skilled in the art would realize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. In the drawings, parts irrelevant to the description are omitted in order to clearly describe the present invention, and like reference numerals designate like parts throughout the specification.

FIG. 1 is a flowchart illustrating a health status analysis and health status information providing method according to an embodiment of the present invention. As shown in FIG. 1, the imaging-based health status analysis and health status information providing method of the present invention includes receiving a captured image of the body (S100), extracting an image of a specific body part from the captured image (S200), calculating the ratios of bone, fat, and muscle in the extracted image (S300), and extracting and displaying, from the diagnosis table in which health status information according to the ratios is stored, the health status information corresponding to the calculated ratios (S400).

Receiving the captured image (S100) is a step of receiving an image of the test subject's body. The image of the test subject's body may be an image captured by computed tomography (CT), magnetic resonance imaging (MRI), or X-ray. When the subject's body is imaged by CT, MRI, or X-ray, the bone, fat, and muscle inside the body each attenuate the X-rays to a different degree (hereinafter, the attenuation value), and the brightness displayed in the image (hereinafter, the brightness value) differs according to the attenuation value, so the regions in which bone, fat, and muscle are distributed can easily be distinguished. The image of the subject's body may be a single image taken from one side of the body or a set of images taken from several sides. Images taken from one side of the body allow the health condition to be diagnosed from the areas of bone, fat, and muscle in a relatively short time, but the accuracy of the diagnosis may be lower than when the body is imaged from three sides and the health condition is diagnosed from the volumes of bone, fat, and muscle.
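
Later steps threshold on these attenuation values, so the captured image must be available in Hounsfield units (HU). As a minimal sketch that is not part of the patent text (the pydicom workflow, the helper name, and the file name are assumptions), a CT slice stored as DICOM can be converted to HU as follows:

    # Minimal sketch: read one CT slice and convert raw pixel values to Hounsfield
    # units using the rescale slope/intercept stored in the DICOM header.
    import numpy as np
    import pydicom

    def load_hu_slice(path):
        """Return a 2-D numpy array of attenuation values in HU."""
        ds = pydicom.dcmread(path)
        pixels = ds.pixel_array.astype(np.float32)
        slope = float(getattr(ds, "RescaleSlope", 1.0))
        intercept = float(getattr(ds, "RescaleIntercept", 0.0))
        return pixels * slope + intercept

    hu = load_hu_slice("pelvis_ct_slice.dcm")  # hypothetical file name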

When the step (S100) of receiving the captured image is completed, the step (S200) of extracting an image of a specific body part from the captured image is performed. Extracting an image of the specific body part (S200) is a step of extracting, from the captured image, only the image of the specific body part to be examined, and it is described in detail with reference to FIG. 2.

FIG. 2 is a flowchart illustrating the step of extracting an image of a specific body part in the health status analysis and health status information providing method according to an exemplary embodiment of the present invention. Extracting an image of a specific body part from the captured image (S200) includes matching the captured image to a specific reference point of a pre-stored specific body-part image (S201), resizing the captured image so that the body outline shown in the captured image matches the body outline shown in the specific body-part image (S202), and extracting, from the captured image, the image of the region overlapping the specific body-part image (S203).

Matching the specific reference point (S201) is a step of superimposing the specific body-part image on the captured image. Because the two images are overlaid about the specific reference point, the position of the specific body part within the captured image, which also contains regions other than the specific body part, can be identified as the position where the specific body-part image is overlaid. If the specific reference point were a particular fat deposit or muscle, its position could vary with the test subject's health condition, so the reference point is preferably a specific bone or a specific organ.

The specific body-part image is a standardized rectangular image that defines the uppermost end of the specific body part as an upper limit and the lowermost end as a lower limit and connects the upper limit and the lower limit. The lines connecting the upper limit and the lower limit are preferably located outside the outline of the body so that they do not cut across the body part. For example, if the specific body part is the upper thigh, the uppermost end of the femoral head may be defined as the upper limit and the lowermost end of the lesser trochanter as the lower limit.

In the resizing step (S202), if the image of the overlapping region were extracted immediately after the two images were matched at the specific reference point, a large body would have only part of the specific body part extracted, while a small body would have regions beyond the specific body part extracted, because the extraction range would not reflect the subject's body size. The captured image is therefore reduced or enlarged so that the body outline shown in the captured image matches the body outline shown in the specific body-part image, bringing the extraction ranges of the two images into agreement.

Extracting the image of the overlapping region (S203) is a step of extracting only the specific body-part image from the captured image. Although not shown in FIG. 2, according to an embodiment of the present invention, the method may further include designating an analysis target region in the extracted image after the image has been extracted (S203). For example, the outline of the body and the outline of the bone may be identified in the extracted image, and only the region surrounding the bone may be designated as the analysis target region.

The step (S200) of extracting an image of a specific body part from the captured image is illustrated in FIG. 5, which is an exemplary view of a screen implementing it. FIG. 5(a) shows the captured image 10 and the specific body-part image 11 superimposed about the hip joint, which serves as the specific reference point 12. FIG. 5(b) shows the captured image 10 resized so that the body outline of the specific body-part image 11 and the body outline of the captured image 10 coincide, and only the image of the region of the captured image 10 overlapping the specific body-part image 11 extracted. An analysis target region 13 may then be designated in the extracted image; when the analysis target region 13 is designated as the region surrounding the bone, as shown in FIG. 5, only the region around the femur, excluding the other regions, may be designated as the analysis target region 13.
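
As a rough sketch of steps S201 to S203, and not the patent's actual implementation, the alignment and cropping described above can be approximated as follows. The reference point is assumed to be given as pixel coordinates in both images, the body outline is approximated by a crude air/tissue threshold, the scale factor is taken as the ratio of outline heights, and the stored body-part image is assumed to fall entirely inside the captured image; ref_mask, the -500 HU threshold, and the function names are all illustrative assumptions.

    import numpy as np
    from scipy.ndimage import zoom

    def _outline_height(mask):
        """Height in pixels of the foreground (body) outline in a binary mask."""
        rows = np.where(mask.any(axis=1))[0]
        return rows[-1] - rows[0] + 1

    def extract_body_part(captured_hu, ref_mask, point_captured, point_ref):
        """Align a captured HU image to a stored body-part image and crop the overlap.

        ref_mask       : binary mask of the stored rectangular body-part image
        point_captured : (row, col) of the reference point (e.g. hip joint) in the captured image
        point_ref      : (row, col) of the same reference point in the stored image
        """
        # S202: resize so the captured body outline matches the stored outline height.
        body_mask = captured_hu > -500  # crude air/tissue split (assumption)
        scale = _outline_height(ref_mask) / _outline_height(body_mask)
        resized = zoom(captured_hu, scale, order=1)
        # S201: locate the stored image's top-left corner so the reference points coincide.
        top = int(round(point_captured[0] * scale - point_ref[0]))
        left = int(round(point_captured[1] * scale - point_ref[1]))
        # S203: keep only the region of the captured image covered by the stored image.
        h, w = ref_mask.shape
        window = resized[top:top + h, left:left + w]
        return np.where(ref_mask[:window.shape[0], :window.shape[1]], window, np.nan)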

When the step (S200) of extracting an image of a specific body part from the captured image is completed, the step (S300) of calculating the ratios of bone, fat, and muscle is performed. Calculating the ratios (S300) is a step for determining the health status of the specific body part from the ratios of bone, fat, and muscle in the test subject's body, and it is described in detail with reference to FIGS. 3 and 4.

FIG. 3 is a flowchart illustrating the step of calculating the ratios of bone, fat, and muscle in the health status analysis and health status information providing method according to an exemplary embodiment of the present invention. Calculating the ratios of bone, fat, and muscle in the extracted image (S300) includes calculating the bone area (S301), calculating the fat area (S302), calculating the muscle area (S303), and calculating the ratio of each area to the total area (bone area + fat area + muscle area) (S304).

In the step of calculating the bone area in the extracted image (S301), the region whose brightness value corresponds to an attenuation value of 200 to 900 HU (Hounsfield units), the range corresponding to bone, is found and delimited, and the area of that region is obtained. In the step of calculating the fat area (S302), the region whose brightness value corresponds to an attenuation value of -30 to -190 HU, the range corresponding to fat, is found and delimited, and its area is obtained. In the step of calculating the muscle area (S303), the region whose brightness value corresponds to an attenuation value of 30 to 70 HU, the range corresponding to muscle, is found and delimited, and its area is obtained.

Calculating the ratio of each area to the total area (S304) is a step of calculating the proportion of the total area (bone area + fat area + muscle area) occupied by each of the bone, fat, and muscle areas calculated in steps S301 to S303.

The step (S300) of calculating the ratios of bone, fat, and muscle in the extracted image is illustrated in FIG. 6, which is an exemplary view of a screen implementing it. FIG. 6 shows the bone region, whose brightness value corresponds to an attenuation value of 200 to 900 HU, found in the extracted image and displayed as a closed curve; the fat region, corresponding to an attenuation value of -30 to -190 HU, found by its brightness value and displayed as a closed curve; and the muscle region, corresponding to an attenuation value of 30 to 70 HU, found by its brightness value and displayed as a closed curve. After each region has been displayed as a closed curve in this way, the area occupied by each region is calculated, and the ratio of each area is calculated with respect to the total area (bone area + fat area + muscle area).
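
As a minimal sketch of steps S301 to S304 using the HU ranges quoted above (bone 200 to 900 HU, fat -30 to -190 HU, muscle 30 to 70 HU), the following counts pixels in each range and reports each tissue's share of the combined area; pixel counting stands in for the closed-curve areas of FIG. 6, and the function and parameter names are illustrative, not from the patent.

    import numpy as np

    HU_RANGES = {"bone": (200, 900), "fat": (-190, -30), "muscle": (30, 70)}

    def tissue_area_ratios(hu_image, pixel_area_mm2=1.0):
        """Return each tissue's fraction of the total bone + fat + muscle area."""
        areas = {}
        for tissue, (lo, hi) in HU_RANGES.items():
            # NaN pixels (outside the analysis target region) compare False and are ignored.
            mask = (hu_image >= lo) & (hu_image <= hi)
            areas[tissue] = np.count_nonzero(mask) * pixel_area_mm2
        total = sum(areas.values())
        return {tissue: area / total for tissue, area in areas.items()}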

FIG. 4 is a flowchart illustrating the step of calculating the ratios of bone, fat, and muscle in a health status analysis and health status information providing method according to another exemplary embodiment of the present invention. In this embodiment, calculating the ratios of bone, fat, and muscle in the extracted image (S300) determines the health status from the volume ratios of bone, fat, and muscle, and includes calculating the bone volume (S311), calculating the fat volume (S312), calculating the muscle volume (S313), and calculating the ratio of each volume to the total volume (bone volume + fat volume + muscle volume) (S314). To calculate the volume ratios of bone, fat, and muscle, a plurality of captured images taken from the front, side, and top of the test subject's body must be received in the step (S100) of receiving the captured image.

In the step of calculating the bone volume from the extracted images (S311), the region whose brightness value corresponds to an attenuation value of 200 to 900 HU, the range corresponding to bone, is found and delimited in each captured image, and the bone regions of the captured images are summed to calculate the bone volume.

In the step of calculating the fat volume (S312), the region whose brightness value corresponds to an attenuation value of -30 to -190 HU, the range corresponding to fat, is found and delimited in each captured image, and the fat regions of the captured images are summed to calculate the fat volume.

In the step of calculating the muscle volume (S313), the region whose brightness value corresponds to an attenuation value of 30 to 70 HU, the range corresponding to muscle, is found and delimited in each captured image, and the muscle regions of the captured images are summed to calculate the muscle volume.

Calculating the ratio of each volume to the total volume (S314) is a step of calculating the proportion of the total volume (bone volume + fat volume + muscle volume) occupied by each of the bone, fat, and muscle volumes calculated in steps S311 to S313.

The step (S300) of calculating the ratios of bone, fat, and muscle in the extracted images is illustrated in FIG. 7, which is an exemplary view of a screen implementing it. For this embodiment, images taken from the front, side, and top of the body must be received. FIG. 7(a) is an image of the test subject's body taken from the top, (b) is an image taken from the front, and (c) is an image taken from the side.

As shown in FIG. 7, when the examination concerns the four lumbar vertebrae 14, 15, 16, and 17, the images taken from the front and side can each include all four lumbar vertebrae 14, 15, 16, and 17 in a single shot. In the image taken from the top, however, the second lumbar vertebra 15 through the fourth lumbar vertebra 17 are hidden behind the first lumbar vertebra 14, so only one lumbar vertebra can appear in a single image, and a separate image must be obtained for each lumbar vertebra. FIG. 7 shows the images of the subject's body taken from the front, side, and top, from which the first lumbar vertebra 14 through the fourth lumbar vertebra 17 have been extracted; in all of the extracted images, the bone region, whose brightness value corresponds to an attenuation value of 200 to 900 HU, has been found and displayed as a closed curve. After each region has been displayed as a closed curve in all of the extracted images in this way, the displayed regions are summed to calculate the bone volume. The regions corresponding to muscle and fat are then found by their brightness values, displayed as closed curves, and summed to calculate their respective volumes, and the proportion of each is obtained with respect to the total volume (bone volume + fat volume + muscle volume).
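
Following the description literally, steps S311 to S314 can be sketched as below: each extracted view (front, side, top) is thresholded with the same HU ranges, the per-view region sizes are summed as the stand-in for each tissue's volume, and each tissue's share of the total is taken. Conversion to true volume (slice thickness, pixel spacing) is omitted, and the names are illustrative, not from the patent.

    import numpy as np

    HU_RANGES = {"bone": (200, 900), "fat": (-190, -30), "muscle": (30, 70)}

    def tissue_volume_ratios(hu_images):
        """hu_images: iterable of extracted HU arrays, e.g. the front, side, and top views."""
        volumes = {tissue: 0.0 for tissue in HU_RANGES}
        for hu in hu_images:
            for tissue, (lo, hi) in HU_RANGES.items():
                volumes[tissue] += np.count_nonzero((hu >= lo) & (hu <= hi))
        total = sum(volumes.values())
        return {tissue: volume / total for tissue, volume in volumes.items()}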

When the step (S300) of calculating the ratios of bone, fat, and muscle is completed, the step (S400) of extracting and displaying, from the diagnosis table in which health status information according to the ratios is stored, the health status information corresponding to the calculated ratios is performed. Extracting and displaying the health status information (S400) is a step of determining the health status from the ratios of bone, fat, and muscle calculated in step S300: only the health status information corresponding to the calculated ratios is extracted from the diagnosis table, in which health status information according to the ratios of bone, fat, and muscle is stored, and displayed. According to an exemplary embodiment of the present invention, the health status information may be converted into a score and provided to the test subject.
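
The patent does not disclose the contents of the diagnosis table, so the following sketch of step S400 uses a small invented table of ratio bands purely for illustration: the calculated ratios are looked up and the stored health status text and score are returned for display.

    # Illustrative diagnosis table: (tissue, lower bound, upper bound, message, score).
    # The bands, messages, and scores are invented for this sketch.
    DIAGNOSIS_TABLE = [
        ("fat",  0.00, 0.25, "fat ratio within reference range", 90),
        ("fat",  0.25, 0.40, "elevated fat ratio", 70),
        ("fat",  0.40, 1.01, "high fat ratio", 50),
        ("bone", 0.00, 0.10, "low bone ratio", 60),
    ]

    def lookup_health_info(ratios):
        """Return the (message, score) entries matching the calculated tissue ratios."""
        results = []
        for tissue, lo, hi, message, score in DIAGNOSIS_TABLE:
            if lo <= ratios.get(tissue, 0.0) < hi:
                results.append((message, score))
        return results

    # Example: lookup_health_info({"bone": 0.08, "fat": 0.31, "muscle": 0.61})
    # -> [("elevated fat ratio", 70), ("low bone ratio", 60)]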

By using the health status analysis and health status information providing method of the present invention, the health condition can be diagnosed from the proportions of bone, fat, and muscle actually distributed in the body, which improves diagnostic accuracy compared with the prior art.

The embodiments of the present invention disclosed in the specification and drawings are merely specific examples presented to explain the technical content of the present invention easily and to aid understanding of the present invention, and are not intended to limit its scope. It will be apparent to those skilled in the art that other modifications based on the technical idea of the present invention can be carried out in addition to the embodiments disclosed herein.

Claims (10)

  1. A method for analyzing health status based on image capture and providing health status information, the method comprising:
    receiving a captured image of a body;
    extracting an image of a specific body part from the captured image;
    calculating ratios of bone, fat, and muscle in the extracted image; and
    extracting and displaying, from a diagnosis table in which health status information according to the ratios is stored, the health status information corresponding to the calculated ratios.
  2. The method of claim 1, wherein receiving the captured image comprises receiving an image captured by computed tomography (CT), magnetic resonance imaging (MRI), or X-ray.
  3. The method of claim 1, wherein extracting the image comprises:
    matching the captured image to a specific reference point of a pre-stored specific body-part image;
    resizing the captured image so that a body outline shown in the captured image matches a body outline shown in the specific body-part image; and
    extracting, from the captured image, an image of a region overlapping the specific body-part image.
  4. The method of claim 3, wherein the pre-stored specific body-part image is a standardized image that defines an uppermost end of the specific body part as an upper limit and a lowermost end as a lower limit and connects the upper limit and the lower limit.
  5. The method of claim 1, further comprising, after extracting the image, designating an analysis target region in the extracted image.
  6. The method of claim 1, wherein receiving the captured image comprises receiving an image of the user's body taken from the front.
  7. The method of claim 6, wherein calculating the ratios of bone, fat, and muscle in the extracted image comprises:
    calculating areas by detecting regions whose attenuation values correspond to bone, fat, and muscle; and
    calculating a ratio of each area.
  8. The method of claim 1, wherein receiving the captured image comprises receiving images of the user's body taken from the front, side, and top.
  9. The method of claim 8, wherein calculating the ratios of bone, fat, and muscle in the extracted image comprises:
    calculating volumes by detecting regions whose attenuation values correspond to bone, fat, and muscle; and
    calculating a ratio of each volume.
  10. The method of claim 1, wherein extracting and displaying the health status information comprises converting the health status information into a score and displaying the converted information.
PCT/KR2017/006478 2016-06-21 2017-06-20 Method for analyzing health condition on basis of image photographing and providing health condition information WO2017222283A1 (en)

Priority Applications (2)

KR20160077561, priority date 2016-06-21
KR10-2016-0077561, priority date 2016-06-21

Publications (1)

WO2017222283A1, published 2017-12-28

Family

ID=60784166

Family Applications (1)

PCT/KR2017/006478 (WO), priority date 2016-06-21, filing date 2017-06-20: Method for analyzing health condition on basis of image photographing and providing health condition information

Country Status (1)

Country Link
WO (1) WO2017222283A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number (priority date / publication date), assignee: title
KR20030081340A * (2000-11-29 / 2003-10-17), 가부시키가이샤 아트헤븐나인: Method and device for measuring body compositions
KR20030018792A * (2001-08-31 / 2003-03-06), 버츄얼아이테크 주식회사: A medical image processing system and the method thereof
JP2008142532A * (2007-11-14 / 2008-06-26), Hitachi Medical Corp: Medical image diagnosis support system
KR20140144645A * (2013-06-11 / 2014-12-19), 삼성전자주식회사: The method and apparatus otaining a image related to region of interest of a target object
JP2015142619A * (2014-01-31 / 2015-08-06), セイコーエプソン株式会社: Processing system, ultrasonic measurement apparatus, program and ultrasonic measurement method

Legal Events

Code  Description
121   Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 17815696; country of ref document: EP; kind code of ref document: A1)
NENP  Non-entry into the national phase (ref country code: DE)
32PN  Ep: public notification in the EP bulletin as the address of the addressee cannot be established (free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.04.2019))
122   Ep: PCT application non-entry in the European phase (ref document number: 17815696; country of ref document: EP; kind code of ref document: A1)