WO2023282389A1 - Method for calculating fat mass using a head and neck image and device therefor - Google Patents

Method for calculating fat mass using a head and neck image and device therefor Download PDF

Info

Publication number
WO2023282389A1
WO2023282389A1 (PCT/KR2021/013299)
Authority
WO
WIPO (PCT)
Prior art keywords
image information
fat
region
head
image
Prior art date
Application number
PCT/KR2021/013299
Other languages
English (en)
Korean (ko)
Inventor
기리시스리니바산
김한석
유영성
피재우
로히스하리다스
니키타토마스
아킬라페루말라
Original Assignee
주식회사 피노맥스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 피노맥스 filed Critical 주식회사 피노맥스
Publication of WO2023282389A1 publication Critical patent/WO2023282389A1/fr

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4869 Determining body composition
    • A61B5/4872 Body fat
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7271 Specific aspects of physiological measurement analysis
    • A61B5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Definitions

  • the technical idea of the present disclosure relates to a fat amount calculation method using head and neck images and an apparatus therefor.
  • the bioelectrical impedance method is a method of measuring a resistance value after flowing a microcurrent through the body, and has a disadvantage in that the accuracy is low and the measured value varies greatly depending on the subject's water intake.
  • the bone density measurement method is a method of measuring the transmittance with which radiation penetrates the human body, and frequent imaging carries a risk of radiation exposure. Therefore, a technology for measuring body composition more safely and accurately is required.
  • a fat mass calculation method and an apparatus therefor according to the present disclosure aim to accurately and quickly detect a fat region from an MR image and to calculate the fat mass in the head and neck region from it.
  • a fat mass calculation method includes obtaining an image group including a plurality of two-dimensional images generated corresponding to the continuous volume of the head and neck of a user; generating first image information in which at least a part of a brain region is removed from the image group by inputting the image group to a first learned network function; generating second image information obtained by detecting a fat region from the first image information by inputting the first image information to a learned second network function; and calculating the amount of fat in the head and neck from the second image information.
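As a non-authoritative sketch of the four claimed steps, the pipeline can be outlined in Python. This is a minimal illustration, not the disclosed implementation: the two learned network functions are replaced by trivial per-pixel rules, and the intensity encoding (0 = background, 1 = fat, 2 = brain) and the voxel volume are invented for the example.

```python
# Toy pipeline mirroring the claimed steps: obtain an image group,
# remove the brain region, detect the fat region, and compute fat amount.

def remove_brain_region(slices):
    # Stand-in for the first learned network function: zero out pixels
    # labeled as brain (intensity 2 in our toy encoding).
    return [[[0 if px == 2 else px for px in row] for row in s] for s in slices]

def detect_fat_region(slices):
    # Stand-in for the second learned network function: produce a binary
    # mask marking fat pixels (intensity 1 in our toy encoding).
    return [[[1 if px == 1 else 0 for px in row] for row in s] for s in slices]

def calculate_fat_amount(masks, voxel_volume_cm3):
    # Fat amount = number of fat voxels * volume per voxel.
    n = sum(px for s in masks for row in s for px in row)
    return n * voxel_volume_cm3

# Toy image group: two 2x3 slices (0 = background, 1 = fat, 2 = brain).
image_group = [
    [[0, 1, 2], [1, 2, 0]],
    [[2, 1, 0], [0, 0, 1]],
]
first = remove_brain_region(image_group)     # first image information
second = detect_fat_region(first)            # second image information
fat_cm3 = calculate_fat_amount(second, voxel_volume_cm3=0.5)
```

In this toy run the four fat voxels at 0.5 cm³ each give a fat amount of 2.0 cm³; a real system would operate on full MR slices with trained segmentation networks.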
  • the head and neck portion may include a region from the head of the user to a second cervical vertebra of the neck.
  • the generating of the first image information may be performed by removing a white matter region from among the brain regions.
  • the method may further include generating third image information including the fat region based on the first image information and the second image information.
  • the method may further include estimating the amount of fat in the entire body from the amount of fat.
  • the method may further include estimating a disease risk of the user from the amount of fat.
  • the step of predicting the disease risk may be performed by predicting the degree of risk for at least one of abdominal fat, sarcopenia, dyslipidemia, diabetes, and hyperuricemia.
  • the method may further include obtaining body information of the user, wherein the body information may include height, weight, and past medical history of the user.
  • a fat mass calculating device includes a memory storing a program for calculating fat mass; and at least one processor configured to, by executing the program, obtain an image group including a plurality of two-dimensional images generated corresponding to the continuous volume of the user's head and neck, input the image group to a learned first network function to generate first image information in which at least a part of a brain region is removed from the image group, input the first image information to a learned second network function to generate second image information in which a fat region is detected from the first image information, and calculate the amount of fat in the head and neck region from the second image information.
  • a computer program stored in a recording medium may be provided to execute a fat mass calculation method.
  • according to the fat mass calculation method and the device therefor according to embodiments of the technical concept of the present disclosure, it is possible to quickly and accurately detect a fat region from a head and neck MR image and calculate the fat mass using a neural network.
  • FIG. 1 is a flowchart illustrating a fat mass calculation method according to an embodiment of the present disclosure.
  • Figure 2 is a view for explaining the head and neck according to an embodiment of the present disclosure.
  • FIG. 3 is an exemplary diagram of image information generated through a fat mass calculation method according to an embodiment of the present disclosure.
  • FIG. 4 is an exemplary diagram of image information generated through a fat mass calculation method according to an embodiment of the present disclosure.
  • FIG. 5 is an exemplary view of a disease risk predicted based on a fat mass calculation result according to an embodiment of the present disclosure.
  • FIG. 6 is a block diagram briefly illustrating the configuration of a fat mass calculating device according to an embodiment of the present disclosure.
  • when one component is referred to as being “connected” or “coupled” to another component, the one component may be directly connected or coupled to the other component; however, unless otherwise described, it should be understood that they may also be connected or coupled via an intervening component.
  • the term “unit” means a unit that processes at least one function or operation, and may be implemented by hardware, software, or a combination of hardware and software, such as a processor, a microprocessor, a microcontroller, a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA).
  • the classification of components in the present disclosure is merely a classification according to the main function each component is responsible for. That is, two or more components described below may be combined into a single component, or one component may be divided into two or more components according to more subdivided functions.
  • each component described below may additionally perform some or all of the functions of other components in addition to its own main function, and some of the main functions of each component may instead be performed exclusively by another component.
  • a neural network may be composed of a set of interconnected computational units, which may be generally referred to as nodes, and these nodes may be referred to as neurons.
  • a neural network is generally composed of a plurality of nodes. Nodes constituting a neural network may be interconnected by one or more links.
  • Some of the nodes constituting the neural network may form one layer based on their distances from the first input node. For example, the set of nodes at a distance of n from the first input node may constitute the n-th layer.
  • the neural network described herein may include a deep neural network (DNN) including a plurality of hidden layers in addition to an input layer and an output layer.
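The layer definition above (nodes at link distance n from the input nodes form the n-th layer) can be sketched with a breadth-first traversal; the graph and node names below are made up for illustration.

```python
# Group nodes into layers by their link distance from the input nodes.
from collections import deque

def layers_by_distance(links, input_nodes):
    # Breadth-first search: dist[node] = number of links from the inputs.
    dist = {n: 0 for n in input_nodes}
    q = deque(input_nodes)
    while q:
        u = q.popleft()
        for v in links.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    # Collect nodes sharing a distance into one layer.
    layers = {}
    for node, d in dist.items():
        layers.setdefault(d, set()).add(node)
    return layers

# Tiny made-up network: two inputs, one hidden layer, one output.
links = {"in1": ["h1", "h2"], "in2": ["h2"], "h1": ["out"], "h2": ["out"]}
layers = layers_by_distance(links, ["in1", "in2"])
```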
  • FIG. 1 is a flowchart illustrating a fat mass calculation method according to an embodiment of the present disclosure.
  • the fat mass calculation method 100 may be performed in a personal computer with computing capability, a workstation, a server computer, or a separate device dedicated to this purpose.
  • the fat mass calculation method 100 may be performed in one or more computing devices.
  • at least one or more steps of the method 100 for calculating fat mass according to an embodiment of the present disclosure may be performed by a client device and other steps may be performed by a server device.
  • the client device and the server device may be connected through a network to transmit and receive calculation results.
  • the fat mass calculation method 100 may be performed by distributed computing technology.
  • the fat mass calculating device may obtain an image group including a plurality of 2D images generated corresponding to the continuous volume of the head and neck region.
  • the image group may be MR images of the subject's head and neck. That is, the plurality of 2D images constituting the image group may be a plurality of slices obtained by sequentially photographing cross-sections of the head and neck in one direction through magnetic resonance imaging. By stacking these two-dimensional images, three-dimensional image information on the head and neck can be obtained.
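As a small illustration of the slice-stacking idea, assuming slices arrive as equal-shaped 2-D arrays (shapes and intensity values below are toy examples, not MR data):

```python
# Stacking sequential 2-D cross-sections yields a 3-D volume indexed as
# volume[z][y][x], where z runs along the imaging direction.

def stack_slices(slices):
    # Validate that every slice has the same in-plane shape before
    # treating the ordered list as a 3-D volume.
    h, w = len(slices[0]), len(slices[0][0])
    for s in slices:
        if len(s) != h or any(len(row) != w for row in s):
            raise ValueError("all slices must share one shape")
    return slices

slices = [
    [[10, 20], [30, 40]],   # cross-section nearest the top of the head
    [[50, 60], [70, 80]],   # next cross-section along the scan direction
]
volume = stack_slices(slices)
depth, height, width = len(volume), len(volume[0]), len(volume[0][0])
```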
  • the head and neck region may include a region from the head of the user to the second cervical vertebra in the neck region.
  • the fat mass calculation device may input the image group to the learned first network function to generate first image information in which at least a part of a brain region is removed from the image group.
  • the first network function may be one in which brain region segmentation has been previously learned through training data (e.g., brain region MR images in which brain region extraction and/or separation has been performed by an expert).
  • step S120 may be performed by removing a white matter region from among brain regions.
  • the fat mass calculation device may input the first image information to the learned second network function to generate second image information obtained by detecting a fat region from the first image information.
  • the second network function may be one in which fat region detection has been previously learned through training data (e.g., head and neck MR images in which fat region detection has been performed by an expert).
  • the second image information may be another 2D image obtained by masking a fat region in a plurality of 2D images, or data including coordinate information on the fat region.
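A brief sketch of the two representations mentioned for the second image information, converting between a binary fat mask and explicit fat-pixel coordinates; the function names are made up for illustration.

```python
# Convert a per-slice binary fat mask into (slice, row, col) coordinate
# tuples, and back again.

def mask_to_coords(masks):
    return [(z, y, x)
            for z, s in enumerate(masks)
            for y, row in enumerate(s)
            for x, px in enumerate(row) if px]

def coords_to_mask(coords, depth, height, width):
    masks = [[[0] * width for _ in range(height)] for _ in range(depth)]
    for z, y, x in coords:
        masks[z][y][x] = 1
    return masks

fat_masks = [[[0, 1], [1, 0]]]          # one toy 2x2 slice
fat_coords = mask_to_coords(fat_masks)  # coordinate form of the same data
```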
  • the first network function and the second network function may be generated by the same or different learning models.
  • the fat amount calculating device may calculate the amount of fat in the head and neck from the second image information.
  • step S140 may include calculating the area of the fat region in each 2D image of the second image information, and calculating the fat amount by summing these areas over the 2D images.
  • step S140 may be performed by calculating the amount of fat based on the number of pixels corresponding to the fat area and the volume per pixel in the second image information.
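The per-slice variant of step S140 can be sketched as follows; the binary masks, pixel area, and slice thickness are invented example values, not parameters from the disclosure.

```python
# Per-slice fat amount: compute the fat area in each 2-D slice, then
# integrate across slices using the slice thickness.

def fat_area_cm2(mask, pixel_area_cm2):
    # Area of the fat region in one slice = fat pixel count * pixel area.
    return sum(px for row in mask for px in row) * pixel_area_cm2

def fat_volume_cm3(masks, pixel_area_cm2, slice_thickness_cm):
    # Volume = (sum of per-slice fat areas) * slice thickness.
    return sum(fat_area_cm2(m, pixel_area_cm2) for m in masks) * slice_thickness_cm

masks = [[[1, 0], [1, 1]], [[0, 0], [1, 0]]]   # binary fat masks, 2 slices
vol = fat_volume_cm3(masks, pixel_area_cm2=0.01, slice_thickness_cm=0.4)
```

With four fat pixels in total, this toy case yields 4 * 0.01 cm² * 0.4 cm = 0.016 cm³, the same result the pixel-count method would give.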
  • the fat mass calculating device may generate third image information including a fat region based on the first image information and the second image information.
  • the third image information may include a plurality of 2D images and/or a 3D image generated by combining them.
  • the third image information may be displayed to users, medical staff, and other experts. For example, the third image information may be displayed by overlapping a head and neck image generated based on the first image information with a fat region generated based on the second image information.
  • the fat region and other regions may be displayed with different colors and/or different transparency.
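A hypothetical sketch of the overlay described above: the fat mask is blended over a grayscale head and neck slice with a distinct color and transparency. The highlight color, alpha value, and toy 2x2 image are illustrative choices, not values from the disclosure.

```python
# Alpha-blend a highlight color over fat pixels; keep other pixels grayscale.

def overlay_fat(gray, mask, alpha=0.5, fat_rgb=(255, 0, 0)):
    out = []
    for gray_row, mask_row in zip(gray, mask):
        row = []
        for g, m in zip(gray_row, mask_row):
            if m:  # fat pixel: blend the highlight color over the intensity
                row.append(tuple(round(alpha * c + (1 - alpha) * g)
                                 for c in fat_rgb))
            else:  # non-fat pixel: neutral grayscale
                row.append((g, g, g))
        out.append(row)
    return out

slice_img = [[100, 200], [50, 0]]   # toy grayscale slice
fat_mask = [[1, 0], [0, 1]]         # binary fat mask from the second network
rgb = overlay_fat(slice_img, fat_mask)
```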
  • the fat mass calculation device may acquire user's body information.
  • the body information may include at least one of the user's height, weight, and past medical history.
  • the body information may be used together with the fat mass in step S140, or independently of it, to predict the fat mass, muscle mass, disease risk, and the like for the user's entire body.
  • the fat amount calculating device may predict the fat amount of the entire body from the head and neck fat amount.
  • the user's body information may be considered together, but is not limited thereto.
  • the fat mass calculation device may predict the disease risk from the fat mass in step S140.
  • the user's body information may be considered together, but is not limited thereto.
  • the disease risk may include a risk for at least one of abdominal fat, sarcopenia, dyslipidemia, diabetes, and hyperuricemia.
  • these diseases are exemplary and can be used to predict the risk of various diseases that have a correlation with fat mass.
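A toy illustration of staging a fat amount and mapping it to disease risk levels; the thresholds, stage names, and the uniform per-disease mapping are all invented for the sketch (the disclosure does not specify them), and a real predictor would weigh body information per disease.

```python
# Classify the head-and-neck fat amount into stages, then map every
# listed disease to a coarse risk level from that stage.

def fat_stage(fat_cm3, low=1000.0, high=1500.0):
    # Invented thresholds: below `low` is "low", up to `high` is
    # "normal", above is "risk".
    if fat_cm3 < low:
        return "low"
    if fat_cm3 <= high:
        return "normal"
    return "risk"

def disease_risk(fat_cm3,
                 diseases=("abdominal fat", "sarcopenia", "dyslipidemia",
                           "diabetes", "hyperuricemia")):
    stage = fat_stage(fat_cm3)
    return {d: stage for d in diseases}

risks = disease_risk(1529.84)   # fat amount from the FIG. 5 example
```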
  • Figure 2 is a view for explaining the head and neck according to an embodiment of the present disclosure.
  • the fat mass calculation device may acquire an image group including a plurality of 2D images generated corresponding to the continuous volume of the head and neck region, and calculate the fat mass based on the obtained image group.
  • the head and neck includes the head and the neck, and more specifically, as shown in FIG. 2 , it may include a region from the head to the second cervical vertebra (axis, C2) of the neck.
  • in consideration of this, an image group captured for the region including the mandible, that is, the region above the red line in FIG. 2, can be used.
  • the target region of the image group shown in FIG. 2 is an example, and image groups for various regions may be used according to an embodiment of the present disclosure.
  • FIG. 3 is an exemplary diagram of image information generated through a fat mass calculation method according to an embodiment of the present disclosure.
  • the image group may include a plurality of 2D images generated corresponding to the continuous volume of the head and neck, and thus each 2D image may be displayed and/or a 3D image generated by combining them may be displayed.
  • first image information generated by the first network function is shown.
  • the first image information is obtained by removing a brain region from the image group, and may include a plurality of 2D images and/or a 3D image generated by combining them.
  • the image information shown in FIG. 3 is exemplary, and various image information may be used according to an embodiment of the present disclosure.
  • FIG. 4 is an exemplary diagram of image information generated through a fat mass calculation method according to an embodiment of the present disclosure.
  • third image information is shown.
  • third image information may be generated by overlapping the second image information on the first image information. That is, the third image information may relate to a head and neck image in which a fat region is displayed in the head and neck region from which the brain region is removed. In this case, the fat region may be different from other regions in color and transparency, and through this, the fat region may be more easily recognized.
  • the image information shown in FIG. 4 is exemplary, and various image information may be used according to an embodiment of the present disclosure.
  • FIG. 5 is an exemplary view of a disease risk predicted based on a fat mass calculation result according to an embodiment of the present disclosure.
  • since the amount of fat is calculated through head and neck imaging, it is possible to predict the risk of diseases that are highly correlated with the amount of fat. For example, stages such as low, normal, and risk may be classified according to the amount of fat in the head and neck region, and the risk level for a disease may be predicted based on the classification. At this time, the user's body information may be used together.
  • in FIG. 5, the amount of fat calculated through the head and neck image is 1529.84 cm³; considering this fat amount together with the user's body information, the user's risks of abdominal fat, sarcopenia, dyslipidemia, and diabetes are low, but the risk of hyperuricemia is extremely high, indicating the need for a more detailed examination and treatment.
  • the disease risk shown in FIG. 5 is an example, and the risk of various diseases can be predicted according to an embodiment of the present disclosure.
  • FIG. 6 is a block diagram briefly illustrating the configuration of a fat mass calculating device according to an embodiment of the present disclosure.
  • the communication unit 610 may receive input data (head and neck MR images, etc.) for fat mass calculation.
  • the communication unit 610 may include a wired/wireless communication unit.
  • the communication unit 610 may include one or more components that enable communication through a local area network (LAN), a wide area network (WAN), a value-added network (VAN), a mobile radio communication network, a satellite communication network, or a combination thereof.
  • when the communication unit 610 includes a wireless communication unit, it can transmit and receive data or signals wirelessly using cellular communication, a wireless LAN (e.g., Wi-Fi), and the like.
  • the communication unit 610 may transmit/receive data or signals with an external device or an external server under the control of the processor 640 .
  • the input unit 620 may receive various user commands through external manipulation.
  • the input unit 620 may include or connect one or more input devices.
  • the input unit 620 may receive user commands by being connected to various input interfaces such as a keypad and a mouse.
  • the input unit 620 may include interfaces such as Thunderbolt as well as a USB port.
  • the input unit 620 may receive an external user command by including or combining various input devices such as a touch screen and buttons.
  • the memory 630 may store programs for operation of the processor 640 and may temporarily or permanently store input/output data.
  • the memory 630 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, SRAM, ROM, EEPROM, PROM, magnetic memory, a magnetic disk, and an optical disk.
  • the memory 630 may store various network functions and algorithms, and may store various data, programs (one or more instructions), applications, software, commands, codes, etc. for driving and controlling the device.
  • the processor 640 may control the overall operation of the device. Processor 640 may execute one or more programs stored in memory 630 .
  • the processor 640 may be a central processing unit (CPU), a graphics processing unit (GPU), or a dedicated processor on which methods according to the technical idea of the present disclosure are performed.
  • the processor 640 may acquire an image group including a plurality of 2D images generated corresponding to the continuous volume of the head and neck of the user, input the image group to the learned first network function to generate first image information in which at least a part of the brain region is removed from the image group, input the first image information to the learned second network function to generate second image information in which a fat region is detected from the first image information, and calculate the amount of fat in the head and neck from the second image information.
  • a fat mass calculation method may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program commands recorded on the medium may be specially designed and configured for the present disclosure, or may be known and usable to those skilled in computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include high-level language codes that can be executed by a computer using an interpreter, as well as machine language codes such as those produced by a compiler.
  • the fat mass calculation method according to the disclosed embodiments may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • a computer program product may include a S/W program and a computer-readable storage medium in which the S/W program is stored.
  • a computer program product may include a product in the form of a S/W program (e.g., a downloadable app) distributed electronically through a manufacturer of an electronic device or an electronic marketplace (e.g., Google Play Store, App Store).
  • a part of the S/W program may be stored in a storage medium or temporarily generated.
  • the storage medium may be a storage medium of a manufacturer's server, a server of an electronic market, or a relay server that temporarily stores the S/W program.
  • a computer program product may include a storage medium of a server or a storage medium of a client device in a system composed of a server and a client device.
  • the computer program product may include a storage medium of the third device.
  • the computer program product may include a S/W program itself transmitted from the server to the client device or the third device or from the third device to the client device.
  • one of the server, the client device and the third device may execute the computer program product to perform the method according to the disclosed embodiments.
  • two or more of the server, the client device, and the third device may execute the computer program product to implement the method according to the disclosed embodiments in a distributed manner.
  • a server may execute a computer program product stored in the server to control a client device communicatively connected to the server to perform a method according to the disclosed embodiments.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Primary Health Care (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Psychiatry (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a fat mass calculation method and a device therefor. A fat mass calculation method according to an embodiment disclosed in the present invention may comprise the steps of: acquiring an image group comprising a plurality of two-dimensional images generated corresponding to the continuous volume of a user's head and neck; inputting the image group into a trained first network function to generate first image information in which at least a part of a brain region has been removed from the image group; inputting the first image information into a trained second network function to generate second image information in which a fat region has been detected in the first image information; and calculating the fat mass in the head and neck from the second image information.
PCT/KR2021/013299 2021-07-09 2021-09-29 Method for calculating fat mass using a head and neck image and device therefor WO2023282389A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210090174A KR102325970B1 (ko) 2021-07-09 2021-07-09 Fat mass calculation method using head and neck images and device therefor
KR10-2021-0090174 2021-07-09

Publications (1)

Publication Number Publication Date
WO2023282389A1 true WO2023282389A1 (fr) 2023-01-12

Family

Family ID: 78716861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/013299 WO2023282389A1 (fr) 2021-07-09 2021-09-29 Method for calculating fat mass using a head and neck image and device therefor

Country Status (2)

Country Link
KR (2) KR102325970B1 (fr)
WO (1) WO2023282389A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116509389B * 2023-06-27 2023-09-01 深圳启脉科技有限公司 Radio frequency-based blood lipid monitoring method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010110701A * 2000-01-27 2001-12-13 (assignee to be submitted later) Water and fat separated magnetic resonance imaging
JP2017202311A * 2016-05-09 2017-11-16 東芝メディカルシステムズ株式会社 Medical image diagnostic apparatus and management apparatus
KR20190060606A * 2017-11-24 2019-06-03 삼성전자주식회사 Medical image diagnosis apparatus and method
KR102118723B1 * 2019-10-17 2020-06-04 메디컬아이피 주식회사 Abdominal image analysis method and apparatus therefor
KR20210073622A * 2019-12-09 2021-06-21 시너지에이아이 주식회사 Method and apparatus for measuring organ volume using an artificial neural network


Also Published As

Publication number Publication date
KR102325970B1 (ko) 2021-11-16
KR20230010164A (ko) 2023-01-18

Similar Documents

Publication Publication Date Title
WO2020207377A1 Method, device and system for training an image recognition model and for image recognition
WO2019208848A1 Three-dimensional eyeball movement measurement method and automatic deep learning-based dizziness diagnosis system
JP7057959B2 Motion analysis device
WO2019132589A1 Image processing device and method for detecting multiple objects
WO2021071288A1 Method and device for training a fracture diagnosis model
WO2017051943A1 Image generation method and apparatus, and image analysis method
CN105612533A Liveness detection method, liveness detection system, and computer program product
WO2019198850A1 Method and system for generating an animal-shaped avatar using a human face
WO2015182904A1 Apparatus for studying a region of interest and method for detecting an object of interest
WO2022145841A1 Lesion interpretation method and device therefor
CN102697446A Image processing device and image processing method
WO2023282389A1 Method for calculating fat mass using a head and neck image and device therefor
WO2022131642A1 Apparatus and method for determining disease severity on the basis of medical images
WO2023095989A1 Method and device for analyzing multimodality medical images for the diagnosis of a brain disease
WO2023273297A1 Multimodality-based living body detection method and apparatus, electronic device, and storage medium
WO2024010390A1 Method, program, and device for monitoring the control of a medical robot
Weales et al. A robust machine vision system for body measurements of beef calves
WO2014204126A2 Apparatus for capturing 3D ultrasound images and method for operating same
WO2019164277A1 Method and device for evaluating bleeding by using a surgical image
WO2021225422A1 Method and apparatus for providing information associated with immune phenotypes for a pathology slide image
WO2021177771A1 Method and system for predicting the expression of a biomarker from a medical image
WO2015137542A1 Medical image processing device for medical diagnosis and method therefor
WO2023163287A1 Medical image analysis method and apparatus
WO2023058837A1 Method for detecting a diaphragm from a chest image, and device therefor
WO2022220383A1 Method and system for measuring the change in size of a target lesion in a radiographic image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21949428

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE