CN113520443A - PET system parameter recommendation method and PET system - Google Patents

PET system parameter recommendation method and PET system

Info

Publication number
CN113520443A
Authority
CN
China
Prior art keywords
anatomical feature
scanning
feature point
body part
determining
Prior art date
Legal status
Pending
Application number
CN202110952756.6A
Other languages
Chinese (zh)
Inventor
顾笑悦
王超
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202110952756.6A
Publication of CN113520443A
Priority to EP22857918.1A (published as EP4329624A4)
Priority to PCT/CN2022/113544 (published as WO2023020609A1)
Priority to US18/434,934 (published as US20240242400A1)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/037 Emission tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/50 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B 6/501 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of the head, e.g. neuroimaging or craniography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5258 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • General Business, Economics & Management (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Business, Economics & Management (AREA)
  • Evolutionary Biology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Dentistry (AREA)

Abstract

The present application relates to a PET system parameter recommendation method and a PET system. The PET system parameter recommendation method comprises the following steps: acquiring a scout image of a scanned object; identifying anatomical feature points of the scanned object in the scout image; determining an association relationship between the anatomical feature points and a scanning bed; and recommending scanning parameters and/or image reconstruction parameters according to the association relationship. The present application solves the problem in the related art that PET scanning can be performed only after parameters for different body parts are set manually, which is time-consuming, labor-intensive, and inefficient, and thereby improves the efficiency of PET scanning.

Description

PET system parameter recommendation method and PET system
Technical Field
The application relates to the technical field of medical equipment, in particular to a PET system parameter recommendation method and a PET system.
Background
Positron Emission Tomography (PET) is an advanced clinical examination imaging technique in the field of nuclear medicine. Before scanning with a PET system, substances necessary for the metabolism of living organisms, such as glucose, proteins, nucleic acids, and fatty acids, must first be labeled with short-lived radionuclides and injected into the human body. The accumulation of these substances during metabolism then reflects the metabolic activity of the organism, thereby achieving the purpose of diagnosis.
In the related art, in order to optimize the scanning parameters or reconstruction parameters for different body parts, a technician usually has to set the parameters for each body part manually, which is time-consuming, labor-intensive, and inefficient.
No effective solution to the above problems has been proposed.
Disclosure of Invention
The embodiments of the present application provide a PET system parameter recommendation method and a PET system, so as to at least solve the problems in the related art that PET scanning can be performed only after the parameters for different body parts are set manually, which is time-consuming, labor-intensive, and inefficient.
In a first aspect, an embodiment of the present application provides a PET system parameter recommendation method, including:
acquiring a scout image of a scanned object;
identifying anatomical feature points of the scanned object in the scout image;
determining an association relationship between the anatomical feature point and a scanning bed;
and recommending scanning parameters and/or image reconstruction parameters according to the association relationship.
In some of these embodiments, the determining the association between the anatomical feature point and the scanning bed comprises:
acquiring a body part label of each anatomical feature point;
counting label information of body part labels of the anatomical feature points;
classifying the anatomical feature points according to the label information, and determining a classification result;
and determining the association relationship between the anatomical feature point and the scanning bed according to the classification result.
In some of these embodiments, the determining the association relationship between the anatomical feature point and the scanning bed according to the classification result includes:
determining a first corresponding relation between the anatomical feature point and the body part of the scanning object according to label information of a body part label of the anatomical feature point of the scanning object;
determining a second corresponding relation between the body part of the scanning object and the scanning bed according to the first corresponding relation;
and determining the association relationship between the anatomical feature point and the scanning bed according to the second corresponding relationship.
In some embodiments, the determining, according to the label information of the body part label of the anatomical feature point of the scanning object, the first corresponding relationship between the anatomical feature point and the body part of the scanning object includes:
the label information comprises the number of body part labels, the categories and the proportion of each category to the total number;
and determining a first corresponding relation between the anatomical feature point and the body part of the scanning object according to the proportion of each category of the body part labels in the label information to the total number.
In some of these embodiments, the determining the association between the anatomical feature point and the scanning bed comprises:
identifying key anatomical feature points among the anatomical feature points;
determining the body part to which the key anatomical feature point belongs, and determining the association relationship between the anatomical feature point and the scanning bed according to the body part to which the key anatomical feature point belongs.
In some of these embodiments, the method further comprises:
displaying a list of the scanning parameters and/or image reconstruction parameters on a human-computer interaction interface;
selecting and/or confirming scan parameters and/or reconstruction parameters in the list;
scanning and/or image reconstruction is performed based on the selection and/or validation results.
In some of these embodiments, the list of image reconstruction parameters includes motion correction parameters.
In some of these embodiments, the list of scan parameters includes a motion monitoring parameter.
In some of these embodiments, the anatomical feature points of the scanned object in the scout image are identified or classified according to a deep learning model.
In a second aspect, an embodiment of the present application provides a PET system, comprising a scanner and a processor:
the scanner is used for acquiring a scout image of a scanned object;
the processor identifies anatomical feature points of the scanned object in the scout image;
the processor determines an association relationship between the anatomical feature points and a scanning bed;
and the processor recommends scanning parameters and/or image reconstruction parameters according to the association relationship.
Compared with the related art, the PET system parameter recommendation method provided by the embodiments of the present application acquires a scout image of a scanned object; identifies anatomical feature points of the scanned object in the scout image; determines an association relationship between the anatomical feature points and a scanning bed; and recommends scanning parameters and/or image reconstruction parameters according to the association relationship. This solves the problem in the related art that PET scanning can be performed only after the parameters for different body parts are set manually, which is time-consuming, labor-intensive, and inefficient, and improves the efficiency of PET scanning.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic diagram of an application environment of a PET system parameter recommendation method according to an embodiment of the present application;
FIG. 2 is a flow chart of a PET system parameter recommendation method according to an embodiment of the present application;
FIG. 3 is a flow chart of a method of determining an association according to an embodiment of the application;
FIG. 4 is a flow chart of another PET system parameter recommendation method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a human-machine interface according to an embodiment of the application;
FIG. 6 is a schematic view of a scanning bed according to an embodiment of the present application;
FIG. 7 is a schematic illustration of anatomical feature points and body parts according to an embodiment of the application;
FIG. 8 is a block diagram of a hardware structure of a terminal of a PET system parameter recommendation method according to an embodiment of the present application;
FIG. 9 is a block diagram of a PET system parameter recommendation device according to an embodiment of the present application;
FIG. 10 is a block diagram of a PET system according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a," "an," "the," and similar words in this application do not denote a limitation of quantity and may refer to the singular or the plural. As used in this application, the terms "including," "comprising," "having," and any variations thereof are intended to cover non-exclusive inclusions; for example, a process, method, system, product, or device that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, product, or device. References to "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means two or more. "And/or" describes an association relationship of associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The terms "first," "second," "third," and the like herein merely distinguish similar objects and do not denote a particular ordering of the objects.
The PET system parameter recommendation method provided by the present application may be applied to the application environment shown in FIG. 1, which is a schematic diagram of the application environment of the PET system parameter recommendation method according to an embodiment of the present application. The PET system comprises a data acquisition device 101, a scanning bed 102, a host 103, and a reconstruction machine 104. A doctor controls the data acquisition device 101 through the host 103 to scan a patient on the scanning bed 102 and obtain the patient's scan data. The host 103 sends the acquired scan data to the reconstruction machine 104 for image reconstruction, finally obtaining a scan image.
The embodiment provides a PET system parameter recommendation method. FIG. 2 is a flow chart of a PET system parameter recommendation method according to an embodiment of the present application, as shown in FIG. 2, the method includes the following steps:
step S210, a scout image of the scan object is acquired.
In the PET system parameter recommendation method in this embodiment, before a scanning object is formally scanned, a scout image of the scanning object needs to be acquired first to determine a specific scanning location and corresponding scanning parameters and/or reconstruction parameters.
The scanned object may be a patient, the scanned region is one or more body parts of the scanned object, and the scout image is a planar image of the examination area obtained by a fast scan, including frontal and lateral views. Specifically, when the scanned object is positioned in the scanning area, a scout scan of the scanning area is performed to obtain the scout image of the scanned object. The specific position of the scanned object in the scanning area can then be determined from the scout image, facilitating further scan planning for the scanned object. Taking the head as the part to be examined, for example, the specific position of the head of the scanned object in the scanning area can be obtained from the scout image, so that only the head of the scanned object is scanned and other parts are not.
Step S220, identify the anatomical feature points of the scanned object in the scout image.
During the scanning process of the PET system, the morphological structure of the scanned object, for example the positions of bones and organs, can be obtained, so anatomical feature points of different body parts can be determined in the scout image according to this morphological structure. The anatomical feature points in this embodiment may be landmark points that distinguish different body parts of the scanned object, for example the skull top, the zygomatic bone, the lung apex, the diaphragm top, and so on.
And step S230, determining the association relationship between the anatomical feature point and the scanning bed.
During a scout scan, the PET system can acquire a plurality of anatomical feature points of the scanned object, such as the skull top, the zygomatic bones, the mandible, the shoulder joints, the lung apex, the diaphragm, the femoral joints, the knees, and the like.
When the scanned object is scanned, different anatomical feature points of the scanned object correspond to different scanning beds, so each anatomical feature point has a scanning bed associated with it. For example, in the case of two scanning beds, the skull top and the zygomatic bone may belong to a first scanning bed and the lung apex to a second scanning bed; the skull top and the zygomatic bone are then associated with the first scanning bed, and the lung apex with the second scanning bed. Specifically, the scanning beds here are the different positions occupied by the scanning bed during the scanning process.
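As a minimal illustration of this association step, the sketch below assigns each anatomical feature point to a scanning bed by its axial coordinate. The bed ranges, point names, and coordinates are illustrative assumptions, not values from this application.

```python
from typing import Optional

# Illustrative only: bed ranges, point names, and coordinates are assumed.
BED_RANGES_MM = [
    (0.0, 300.0),    # first scanning bed (axial z-range along the couch)
    (300.0, 600.0),  # second scanning bed
]

def bed_of(z_mm: float) -> Optional[int]:
    """Return the index of the scanning bed whose axial range contains z_mm."""
    for idx, (z0, z1) in enumerate(BED_RANGES_MM):
        if z0 <= z_mm < z1:
            return idx
    return None

feature_points = {"skull_top": 50.0, "zygomatic_bone": 120.0, "lung_apex": 380.0}
association = {name: bed_of(z) for name, z in feature_points.items()}
# -> {'skull_top': 0, 'zygomatic_bone': 0, 'lung_apex': 1}
```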
And step S240, recommending scanning parameters and/or image reconstruction parameters according to the association relation.
The scanning parameters include radiation dose, scan duration, and the like, and the image reconstruction parameters include the algorithms used in the image reconstruction process, and the like. Specifically, for a given scanning bed, the scanning parameters and/or reconstruction parameters corresponding to its anatomical feature points are determined, and the scanning parameters and/or reconstruction parameters used when scanning at that bed are determined according to the association relationship between the anatomical feature points and the scanning bed.
Through steps S210 to S240, after the scout image is obtained, the anatomical feature points in the scout image are identified, and the scanning parameters and/or image reconstruction parameters used during scanning are then determined based on the association relationship between the anatomical feature points and the scanning bed. In the related art, scan planning is performed based on a scout image, and a technician is required to manually set the parameters for different body parts. In this embodiment, the corresponding scanning parameters and/or image reconstruction parameters can be recommended by identifying the anatomical feature points in the scout image, which solves the problems in the related art that scanning can proceed only after the parameters for different body parts are set manually, wasting time and labor at low efficiency, and improves the scanning efficiency of the PET system.
In some of these embodiments, the anatomical feature points of the scanned object in the scout image are identified based on a deep learning model. The deep learning model in this embodiment may be obtained through training: specifically, anatomical feature points may first be labeled in scout images, and training is then performed on the labeled scout images to obtain the deep learning model of this embodiment, improving the accuracy and efficiency of anatomical feature point identification.
In some of these embodiments, the anatomical feature points are classified according to a deep learning model. Similarly, this deep learning model may be obtained through training: specifically, a plurality of different types of anatomical feature points may be preset, the body part type to which each feature point belongs is labeled, and training is performed on the anatomical feature points and their labeled types to obtain the final deep learning model, improving the accuracy of anatomical feature point classification.
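One possible form of such a deep learning model is sketched below: a small heatmap-based landmark network in which each output channel localizes one type of anatomical feature point. The architecture, the landmark count K, and the tensor sizes are illustrative assumptions, not the model actually used in this application.

```python
import torch
import torch.nn as nn

K = 8  # assumed number of anatomical feature point types (skull top, zygomatic bone, ...)

class LandmarkNet(nn.Module):
    """Toy heatmap regressor: one output channel per anatomical feature point."""
    def __init__(self, k: int = K):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, k, 1)  # one heatmap per landmark type

    def forward(self, scout: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(scout))  # (B, K, H, W)

def decode_points(heatmaps: torch.Tensor) -> torch.Tensor:
    """Take each heatmap's argmax as that landmark's (row, col) position."""
    b, k, h, w = heatmaps.shape
    flat = heatmaps.flatten(2).argmax(dim=2)          # (B, K)
    rows = torch.div(flat, w, rounding_mode="floor")
    cols = flat % w
    return torch.stack((rows, cols), dim=2)           # (B, K, 2)

model = LandmarkNet()
points = decode_points(model(torch.randn(1, 1, 256, 256)))  # landmark pixel coords
```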
In other embodiments, the classification rules between the anatomical feature points and the types of the body parts may be directly set, and the classification rules may be input into the deep learning model for use.
In some embodiments, fig. 3 is a flowchart of a method for determining an association relationship according to an embodiment of the present application, and as shown in fig. 3, the method includes the following steps:
in step S310, a body part label of each anatomical feature point is obtained.
The body part label represents the body part corresponding to an anatomical feature point, e.g., head, chest, abdomen, or lower limbs. The correspondence between anatomical feature points and body part labels may be preset. Optionally, the body part label of each anatomical feature point may be marked manually, or identified automatically by the processor according to the preset correspondence.
Step S320, counting label information of the body part labels of the anatomical feature points.
In this embodiment, the label information of the body part labels is statistical information of all body part labels, including the number, the category, and the like.
And step S330, classifying the anatomical feature points according to the label information and determining a classification result.
After the label information is obtained, the classification result can be derived directly from the label information of the body part labels.
Specifically, the anatomical feature points belong to different body parts: for example, the skull top, the zygomatic bones, and the mandible belong to the head; the lung apex and the diaphragm belong to the chest; and the femoral joints and the knees belong to the lower limbs. Therefore, after a plurality of anatomical feature points are identified from the scout image, the anatomical feature points need to be classified so that each body part can be scanned and reconstructed under its corresponding parameters; it is determined which body part each anatomical feature point belongs to, and a classification result is finally obtained.
Preferably, the number and types of anatomical feature points to identify, the types of body parts, and the correspondence between anatomical feature points and body parts may be preset according to the scenario and requirements.
And step S340, determining the association relationship between the anatomical feature point and the scanning bed according to the classification result.
Through the steps from S310 to S340, the body part labels of all the anatomical feature points are subjected to statistical analysis to obtain label information, all the anatomical feature points are classified according to the label information, and the association relationship is determined based on the classification result, so that the classification accuracy of the anatomical feature points is improved.
Further, after the label information is obtained, determining the association relationship between the anatomical feature points and the scanning bed according to the classification result may be implemented as follows: determining a first corresponding relationship between the anatomical feature points and the body parts of the scanned object according to the label information of the body part labels; determining a second corresponding relationship between the body parts of the scanned object and the scanning beds according to the first corresponding relationship; and determining the association relationship between the anatomical feature points and the scanning beds according to the second corresponding relationship. In particular, different anatomical feature points may belong to different body parts; the first corresponding relationship between the two may be preset by a user and, in use, may be identified by a deep-learning neural network. Meanwhile, since the scanned object and the scanning bed are relatively fixed during scanning, a second corresponding relationship exists between each body part of the scanned object and a scanning bed, from which the association relationship between the anatomical feature points and the scanning beds can finally be determined. For example, suppose the scanned object in the scout image is divided into three scanning beds, each body part corresponds to one scanning bed, and the specific body part types are initially unknown; once the label information yields the types head, chest, and abdomen, it can be determined that the three scanning beds correspond to the anatomical feature points of the head, the chest, and the abdomen, respectively. Determining the final association relationship from the first and second corresponding relationships in this embodiment improves the accuracy of the association relationship; moreover, since different body parts require different parameters during scanning and image reconstruction, scanning parameters and/or image reconstruction parameters can then be recommended according to the association relationship.
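A minimal sketch of chaining the two corresponding relationships follows; the point-to-part and part-to-bed mappings are illustrative assumptions rather than preset values from this application.

```python
# Illustrative mappings only; in practice they are preset or learned as described above.
first = {"skull_top": "head", "zygomatic_bone": "head",    # point -> body part
         "lung_apex": "chest", "diaphragm_top": "chest"}
second = {"head": "bed_1", "chest": "bed_2"}               # body part -> scanning bed

# association relationship: anatomical feature point -> scanning bed
association = {point: second[part] for point, part in first.items()}
# -> {'skull_top': 'bed_1', 'zygomatic_bone': 'bed_1',
#     'lung_apex': 'bed_2', 'diaphragm_top': 'bed_2'}
```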
In some of these embodiments, the label information includes the number of body part labels, their categories, and the proportion of each category to the total number. The number of body part labels is determined by the number of anatomical feature points: when all anatomical feature points are included in the statistical analysis, the number of body part labels equals the number of anatomical feature points. The category is the category of the body part to which an anatomical feature point belongs; preferably, the number of categories and the number of body part labels in each category can be counted at the same time. For example, with 10 anatomical feature points, if a label is attached to each point, the number of body part labels is 10; after counting, these 10 labels might fall into three categories, and the categories are obtained as head, chest, and abdomen.
Further, the total number refers to the number of body part labels contained in each scanning bed, and the proportion of each category to the total number refers to the share of each category in that scanning bed's total.
In general, each scanning bed contains only one category of body part label, in which case that category is the body part corresponding to the scanning bed. If a scanning bed contains two or more categories of body parts, the first corresponding relationship between the anatomical feature points and the body part to be scanned is determined according to the proportion of each category of body part labels to the total number in the label information. For example, if a scanning bed contains 4 body part labels, namely 1 head label and 3 abdomen labels, the abdomen clearly accounts for the larger share of the total, so the body part corresponding to that scanning bed is determined to be the abdomen. This embodiment accounts for the various situations that arise when dividing beds, improving the scene adaptability of the PET system parameter recommendation method.
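The label statistics and the proportion rule can be sketched as follows, using the 1-head/3-abdomen scanning bed from the example above.

```python
from collections import Counter

# One scanning bed containing 4 body part labels (the example above).
labels_in_bed = ["head", "abdomen", "abdomen", "abdomen"]

counts = Counter(labels_in_bed)                        # number per category
total = sum(counts.values())                           # total number for this bed
proportions = {c: n / total for c, n in counts.items()}
# -> {'head': 0.25, 'abdomen': 0.75}

body_part = max(proportions, key=proportions.get)      # dominant category wins
# -> 'abdomen': this scanning bed is treated as an abdomen bed
```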
In some embodiments, the association relationship between the anatomical feature points and the scanning bed can also be determined by identifying the position of a preset key anatomical feature point. Specifically, a key anatomical feature point is identified among the plurality of anatomical feature points through a neural network model; the key anatomical feature point may be one of the anatomical feature points, or may be determined jointly from the positions of one or more of them. After the key anatomical feature point is determined, the body part to which it belongs is determined, and the association relationship between the anatomical feature points and the scanning bed is determined according to that body part. Determining the association relationship from key anatomical feature points avoids the interference caused by inaccurately identified individual feature points, further improving the accuracy of the association relationship.
In some embodiments, fig. 4 is a flowchart of another PET system parameter recommendation method according to embodiments of the present application, as shown in fig. 4, the method including the steps of:
step S410, displaying a list of scanning parameters and/or image reconstruction parameters on a human-computer interaction interface;
step S420, selecting and/or confirming the scanning parameters and/or the reconstruction parameters in the list;
step S430, performing scanning and/or image reconstruction according to the selection and/or confirmation result.
In this embodiment, after the scanning parameters and/or image reconstruction parameters are obtained according to the association relationship, they may be confirmed again through a human-computer interaction interface. Specifically, FIG. 5 is a schematic diagram of the human-computer interaction interface according to an embodiment of the present application. As shown in FIG. 5, a list of image reconstruction parameters is displayed on the interface. The list includes four parts, each with multiple options: for example, the reconstruction part has algorithm options such as time of flight, the correction part has correction-type options, the image part has image-parameter options, and the allocation part has reconstruction-type options. In the list on the human-computer interaction interface, the scanning parameters and/or image reconstruction parameters obtained from the association relationship are pre-selected; the doctor can reselect and/or confirm the parameters in the list according to the actual situation, and the PET system can also autonomously modify the options in the list according to other conditions. Selection may consist of checking different options in the list, and confirmation may consist of confirming all the selected options. In an actual scenario, the selection and confirmation functions may be configured as desired. The PET system obtains the final scanning parameters and/or image reconstruction parameters from the options in the list, scans the scanned object according to the scanning parameters, and reconstructs the image according to the image reconstruction parameters.
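As a rough illustration of such a pre-selected list, the structure below loosely follows the four parts described for FIG. 5; the section and option names are assumptions rather than the actual interface fields.

```python
# Recommended defaults pre-selected from the association relationship;
# all names and values are illustrative assumptions.
recon_param_list = {
    "reconstruction": {"time_of_flight": True, "iterations": 3},
    "correction":     {"attenuation": True, "motion_correction": False},
    "image":          {"matrix": (192, 192), "slice_thickness_mm": 2.0},
    "allocation":     {"reconstruction_type": "static"},
}

def confirm(defaults: dict, user_changes: dict) -> dict:
    """Apply the doctor's reselections on top of the recommended defaults."""
    confirmed = {section: dict(opts) for section, opts in defaults.items()}
    for section, opts in user_changes.items():
        confirmed[section].update(opts)
    return confirmed

final_params = confirm(recon_param_list, {"correction": {"motion_correction": True}})
```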
Through steps S410 to S430, the selection and/or confirmation of the scanning parameters and/or image reconstruction parameters is realized through a visualized human-computer interaction interface, enhancing the interactivity of the scanning process.
In some of these embodiments, the list of image reconstruction parameters includes motion correction parameters. The motion correction parameters in this embodiment are used to correct rigid motion of a body part: for example, during scanning, the head, chest, or abdomen of the scanned object may be displaced, and this displacement needs to be corrected to avoid artifacts in the reconstructed image.
Further, if the respiratory motion of the scanned object is too severe, the chest and abdomen rise and fall excessively, which affects the scanning and reconstruction process; the heartbeat affects scanning and reconstruction in the same way. Therefore, when the body part corresponding to a scanning bed is identified as including the chest or the abdomen, the list of scanning parameters further includes motion monitoring parameters, so that the resulting artifacts can be corrected during reconstruction.
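A one-function sketch of this rule follows, under the assumption that body parts are named as in this embodiment; the parameter keys are illustrative.

```python
def augment_for_motion(body_part: str, scan_params: dict, recon_params: dict) -> None:
    """Add motion-related options for motion-prone body parts (assumed rule)."""
    if body_part in ("head", "chest", "abdomen"):
        recon_params["motion_correction"] = True   # rigid/displacement correction
    if body_part in ("chest", "abdomen"):
        scan_params["motion_monitoring"] = True    # respiratory/cardiac monitoring

scan_p, recon_p = {}, {}
augment_for_motion("abdomen", scan_p, recon_p)
# scan_p  -> {'motion_monitoring': True}
# recon_p -> {'motion_correction': True}
```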
The embodiments of the present application will be described and illustrated in the following practical scenarios.
FIG. 6 is a schematic view of a scanning bed according to an embodiment of the present application. As shown in FIG. 6, when the scout image is processed, virtual scanning bed frames are displayed as rectangles in the scout image; the plurality of scanning bed frames in FIG. 6 correspond to the head, the chest, the abdomen, the pelvis, and the lower limbs, respectively.
FIG. 7 is a schematic diagram of anatomical feature points and body parts according to an embodiment of the present application. As shown in FIG. 7, dots with different gray levels represent different anatomical feature points, and a first, second, third, fourth, and fifth scanning bed were planned while the anatomical feature points were identified. The body parts to which the anatomical feature points belong are then classified, and it is finally determined that the first scanning bed is the head bed, the second the chest bed, the third the abdomen bed, the fourth the pelvis bed, and the fifth the lower-limb bed. Once the specific scanning beds are identified, the corresponding scanning parameters and/or image reconstruction parameters can be recommended according to the body part.
For example, when the skull top and the zygomatic bone are identified among the anatomical feature points and both fall within the range of the first scanning bed, the first scanning bed is determined to be the head bed, and the PET system can then automatically check head motion detection in the scanning parameters. When the lung apices and the diaphragm top are identified among the anatomical feature points, their positions are detected: the lower of the left and right lung apices is taken as the upper boundary of the lungs, the diaphragm top as the lower boundary, and the midpoint between the upper and lower boundaries as the chest locating point. If the chest locating point lies within the second scanning bed, the second scanning bed is determined to be the chest bed, and the PET system can then automatically select respiratory motion detection in the scanning parameters.
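The two concrete rules in this scenario can be sketched as follows. Coordinates are taken as axial positions in the scout image, assumed to increase toward the feet, and all values are illustrative.

```python
def is_head_bed(points: dict, bed_range: tuple) -> bool:
    """First rule: skull top and zygomatic bone both fall inside the bed's range."""
    z0, z1 = bed_range
    return all(z0 <= points[name] < z1 for name in ("skull_top", "zygomatic_bone"))

def chest_locating_point(points: dict) -> float:
    """Second rule: the lower of the two lung apices is the upper lung boundary,
    the diaphragm top is the lower boundary, and their midpoint locates the chest."""
    upper = max(points["left_lung_apex"], points["right_lung_apex"])  # lower = larger z
    lower = points["diaphragm_top"]
    return (upper + lower) / 2.0

pts = {"skull_top": 40.0, "zygomatic_bone": 110.0,
       "left_lung_apex": 330.0, "right_lung_apex": 345.0, "diaphragm_top": 520.0}

if is_head_bed(pts, (0.0, 300.0)):
    head_scan_params = {"head_motion_detection": True}  # auto-checked for this bed

z_chest = chest_locating_point(pts)  # 432.5: if this lies inside the second bed's
# range, that bed is the chest bed and respiratory motion detection is selected.
```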
Furthermore, the scout image is interactive in real time: after the position of a scanning bed is changed, the new position can be detected in real time, the repositioned scanning bed is located and identified, and scanning parameters and/or image reconstruction parameters are recommended according to the label information of the body part labels after the change.
It should be noted that the PET system parameter recommendation method in the present application is applied to a short-axis PET system, in which only one scanning bed is scanned at a time; after one scanning bed is finished, the PET system automatically switches to the scanning parameters corresponding to the next scanning bed and continues scanning the scanned object.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The method embodiments provided in the present application may be executed on a terminal, a computer, or a similar computing device. Taking execution on a terminal as an example, FIG. 8 is a block diagram of the hardware structure of a terminal for the PET system parameter recommendation method according to an embodiment of the present application. As shown in FIG. 8, the terminal 80 may include one or more processors 802 (only one is shown in FIG. 8; the processor 802 may include, but is not limited to, a processing device such as a microcontroller (MCU) or a programmable logic device (FPGA)) and a memory 804 for storing data, and may optionally also include a transmission device 806 for communication functions and an input/output device 808. Those skilled in the art will understand that the structure shown in FIG. 8 is only an illustration and does not limit the structure of the terminal. For example, the terminal 80 may also include more or fewer components than shown in FIG. 8, or have a different configuration from that shown in FIG. 8.
The memory 804 can be used for storing control programs, for example, software programs and modules of application software, such as a control program corresponding to the PET system parameter recommendation method in the embodiment of the present application, and the processor 802 executes various functional applications and data processing by running the control program stored in the memory 804, so as to implement the above-mentioned method. The memory 804 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 804 can further include memory located remotely from the processor 802, which can be connected to the terminal 80 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 806 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the terminal 80. In one example, the transmission device 806 includes a Network adapter (NIC) that can be connected to other Network devices via a base station to communicate with the internet. In one example, the transmission device 806 can be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The embodiment also provides a PET system parameter recommendation device, which is used for implementing the above embodiments and preferred embodiments, and the description of the device is omitted. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 9 is a block diagram of a PET system parameter recommendation apparatus according to an embodiment of the present application, and as shown in fig. 9, the apparatus includes an acquisition unit 91, a determination unit 92, and a recommendation unit 93:
the acquisition unit 91 is configured to acquire a scout image of a scanned object and to identify anatomical feature points of the scanned object in the scout image; the determination unit 92 is configured to determine the association relationship between the anatomical feature points and the scanning bed; and the recommendation unit 93 is configured to recommend scanning parameters and/or image reconstruction parameters according to the association relationship. The identification and classification of the anatomical feature points can be realized through a deep learning model.
In this embodiment, after the acquisition unit 91 obtains the scout image and identifies the anatomical feature points in it, the determination unit 92 determines the association relationship between the anatomical feature points and the scanning bed, on which the recommended scanning parameters and/or image reconstruction parameters are based. In the related art, scan planning is performed based on a scout image, and a technician is required to manually set the parameters for different body parts. In this embodiment, the corresponding scanning parameters and/or image reconstruction parameters can be recommended by identifying the anatomical feature points in the scout image, which solves the problems in the related art that scanning can proceed only after the parameters for different body parts are set manually, wasting time and labor at low efficiency, and improves the scanning efficiency of the PET system.
Further, the determination unit 92 is also configured to obtain a body part label for each anatomical feature point; count the label information of the body part labels of the anatomical feature points; classify the anatomical feature points according to the label information and determine a classification result; and determine the association relationship between the anatomical feature points and the scanning bed according to the classification result, so as to improve the classification accuracy of the anatomical feature points.
Specifically, the determination unit 92 determines a first corresponding relationship between the anatomical feature points and the body parts of the scanned object according to the label information of the body part labels; determines a second corresponding relationship between the body parts of the scanned object and the scanning bed according to the first corresponding relationship; and determines the association relationship between the anatomical feature points and the scanning bed according to the second corresponding relationship.
Further, the label information includes the number and categories of the body part labels and the proportion of each category to the total number, and the determination unit 92 determines the first corresponding relationship between the anatomical feature points and the body part to be scanned according to the proportion of each category of body part labels to the total number in the label information.
In some of these embodiments, the determination unit 92 identifies key anatomical feature points among the anatomical feature points, determines the body part to which a key anatomical feature point belongs, and determines the association relationship between the anatomical feature points and the scanning bed according to that body part.
Further, the device also comprises a human-computer interaction unit configured to display a list of the scanning parameters and/or image reconstruction parameters on a human-computer interaction interface; select and/or confirm the scanning parameters and/or reconstruction parameters in the list; and perform scanning and/or image reconstruction based on the selection and/or confirmation results. The parameter lists include motion correction and/or motion monitoring parameters, enhancing the interactivity of the scanning process.
FIG. 10 is a block diagram of a PET system according to an embodiment of the present application. As shown in FIG. 10, the system includes a scanner 1001 and a processor 1002: the scanner 1001 is used to acquire a scout image of a scanned object; the processor 1002 identifies the anatomical feature points of the scanned object in the scout image, determines the association relationship between the anatomical feature points and the scanning bed, and recommends the scanning parameters and/or image reconstruction parameters according to the association relationship. The identification and classification of the anatomical feature points can be realized through a deep learning model.
After the scanner 1001 obtains the scout image, the processor 1002 identifies the anatomical feature points in the scout image and then determines the scanning parameters and/or image reconstruction parameters used during scanning based on the association relationship between the anatomical feature points and the scanning bed. In the related art, scan planning is performed based on a scout image, and a technician is required to manually set the parameters for different body parts. In this embodiment, the corresponding scanning parameters and/or image reconstruction parameters can be recommended by identifying the anatomical feature points in the scout image, which solves the problems in the related art that scanning can proceed only after the parameters for different body parts are set manually, wasting time and labor at low efficiency, and improves the scanning efficiency of the PET system.
Further, the processor 1002 is also configured to obtain a body part label for each anatomical feature point; count the label information of the body part labels of the anatomical feature points; classify the anatomical feature points according to the label information and determine a classification result; and determine the association relationship between the anatomical feature points and the scanning bed according to the classification result, so as to improve the classification accuracy of the anatomical feature points.
Specifically, the processor 1002 determines a first corresponding relationship between the anatomical feature points and the body parts of the scanned object according to the label information of the body part labels; determines a second corresponding relationship between the body parts of the scanned object and the scanning bed according to the first corresponding relationship; and determines the association relationship between the anatomical feature points and the scanning bed according to the second corresponding relationship.
Further, the label information includes the number and categories of the body part labels and the proportion of each category to the total number, and the processor 1002 determines the first corresponding relationship between the anatomical feature points and the body parts of the scanned object according to the proportion of each category of body part labels to the total number in the label information.
In some of these embodiments, the processor 1002 identifies key anatomical feature points among the anatomical feature points, determines the body part to which a key anatomical feature point belongs, and determines the association relationship between the anatomical feature points and the scanning bed according to that body part.
Further, the processor 1002 is also configured to display a list of scanning parameters and/or image reconstruction parameters on a human-computer interaction interface; select and/or confirm the scanning parameters and/or reconstruction parameters in the list; and perform scanning and/or image reconstruction based on the selection and/or confirmation results. The parameter lists include motion correction and/or motion monitoring parameters, enhancing the interactivity of the scanning process.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring a positioning image of the scanning object;
s2, identifying the anatomical feature points of the scanned object in the scout image;
s3, determining the association relationship between the anatomical feature point and the scanning bed;
and S4, recommending the scanning parameters and/or the image reconstruction parameters according to the association relation.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the PET system parameter recommendation method in the above embodiments, the embodiments of the present application may provide a storage medium for implementation. A computer program is stored on the storage medium; when executed by a processor, the computer program implements any of the PET system parameter recommendation methods in the above embodiments.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A PET system parameter recommendation method is characterized by comprising the following steps:
acquiring a scout image of a scanned object;
identifying anatomical feature points of the scanned object in the scout image;
determining an association relationship between the anatomical feature point and a scanning bed;
and recommending scanning parameters and/or image reconstruction parameters according to the association relationship.
2. The PET system parameter recommendation method of claim 1 wherein the determining an association between the anatomical feature point and a scanning bed comprises:
acquiring a body part label of each anatomical feature point;
counting label information of body part labels of the anatomical feature points;
classifying the anatomical feature points according to the label information, and determining a classification result;
and determining the association between the anatomical feature points and the scanning bed according to the classification result.
3. The PET system parameter recommendation method of claim 2, wherein the determining the association between the anatomical feature points and the scanning bed according to the classification result comprises:
determining a first correspondence between the anatomical feature points and a body part of the scanned object according to the label information of the body part labels of the anatomical feature points of the scanned object;
determining a second correspondence between the body part of the scanned object and the scanning bed according to the first correspondence;
and determining the association between the anatomical feature points and the scanning bed according to the second correspondence.
4. The PET system parameter recommendation method of claim 3, wherein the label information comprises the number of the body part labels, the categories of the labels, and the proportion of each category in the total number; and
the determining the first correspondence between the anatomical feature points and the body part of the scanned object comprises: determining the first correspondence according to the proportion of each category of the body part labels in the label information.
5. The PET system parameter recommendation method of claim 1, wherein the determining the association between the anatomical feature points and the scanning bed comprises:
identifying key anatomical feature points among the anatomical feature points;
and determining the body part to which the key anatomical feature points belong, and determining the association between the anatomical feature points and the scanning bed according to the body part to which the key anatomical feature points belong.
6. The PET system parameter recommendation method of claim 1, further comprising:
displaying a list of the scanning parameters and/or image reconstruction parameters on a human-computer interaction interface;
selecting and/or confirming scanning parameters and/or image reconstruction parameters in the list;
and performing scanning and/or image reconstruction based on a result of the selecting and/or confirming.
7. The PET system parameter recommendation method of claim 6, wherein the list of image reconstruction parameters includes motion correction parameters.
8. The PET system parameter recommendation method of claim 6, wherein the list of scanning parameters includes motion monitoring parameters.
9. The PET system parameter recommendation method of claim 1, wherein the anatomical feature points of the scanned object in the scout image are identified or classified according to a deep learning model.
10. A PET system, comprising a scanner and a processor, wherein:
the scanner is configured to acquire a scout image of a scanned object;
the processor identifies anatomical feature points of the scanned object in the scout image;
the processor determines an association between the anatomical feature points and a scanning bed;
and the processor recommends scanning parameters and/or image reconstruction parameters according to the association.
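For illustration only, the label-statistics classification recited in claims 2 to 4 can be sketched as follows; the function name, example labels, and the choice of the dominant category as the classification result are assumptions for this sketch rather than the claimed implementation:

    # Hypothetical sketch of claims 2-4: count the body part labels of the
    # anatomical feature points, compute each category's proportion of the
    # total, and take the dominant category as the body part.
    from collections import Counter

    def classify_by_label_statistics(body_part_labels):
        """Return (dominant body part, proportion of each label category)."""
        counts = Counter(body_part_labels)          # number per category
        total = sum(counts.values())                # total number of labels
        proportions = {part: n / total for part, n in counts.items()}
        dominant = max(proportions, key=proportions.get)
        return dominant, proportions

    labels = ["chest", "chest", "abdomen", "chest"]   # labels of feature points
    print(classify_by_label_statistics(labels))
    # ('chest', {'chest': 0.75, 'abdomen': 0.25})

The dominant body part obtained this way would give the first correspondence of claim 3 (feature points to body part), after which the body part could be mapped to the bed position covering it (the second correspondence).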
CN202110952756.6A 2021-08-19 2021-08-19 PET system parameter recommendation method and PET system Pending CN113520443A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202110952756.6A CN113520443A (en) 2021-08-19 2021-08-19 PET system parameter recommendation method and PET system
EP22857918.1A EP4329624A4 (en) 2021-08-19 2022-08-19 Systems and methods for medical imaging
PCT/CN2022/113544 WO2023020609A1 (en) 2021-08-19 2022-08-19 Systems and methods for medical imaging
US18/434,934 US20240242400A1 (en) 2021-08-19 2024-02-07 Systems and methods for medical imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110952756.6A CN113520443A (en) 2021-08-19 2021-08-19 PET system parameter recommendation method and PET system

Publications (1)

Publication Number Publication Date
CN113520443A 2021-10-22

Family

ID=78091261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110952756.6A Pending CN113520443A (en) 2021-08-19 2021-08-19 PET system parameter recommendation method and PET system

Country Status (1)

Country Link
CN (1) CN113520443A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140148684A1 (en) * 2012-11-27 2014-05-29 General Electric Company PET Acquisition Scheduling Based on MR SCOUT Images
US20150119703A1 (en) * 2013-10-24 2015-04-30 Siemens Medical Solutions Usa, Inc. Anatomic Range Planning in Positron Emission Tomography
US20180028140A1 (en) * 2013-10-24 2018-02-01 Siemens Medical Solutions Usa, Inc. Anatomic Range Planning in Positron Emission Tomography
CN104605881A (en) * 2014-12-31 2015-05-13 沈阳东软医疗系统有限公司 Parameter optimizing method and medical equipment
US20170084057A1 (en) * 2015-09-17 2017-03-23 Shenyang Neusoft Medical Systems Co., Ltd. Determining pet scanning time
CN107403287A (en) * 2017-08-11 2017-11-28 上海联影医疗科技有限公司 Scan orientation method, apparatus, system and storage medium
CN109381212A (en) * 2018-09-27 2019-02-26 上海联影医疗科技有限公司 A kind of image formation control method and system
CN109567852A (en) * 2019-01-15 2019-04-05 上海联影医疗科技有限公司 The determination method of scanning range, the acquisition methods of medical image, device and equipment
CN110301928A (en) * 2019-07-04 2019-10-08 东软医疗系统股份有限公司 Rebuild the method, apparatus and system of PET image
CN110956633A (en) * 2020-02-26 2020-04-03 南京安科医疗科技有限公司 Rapid CT scanning method and system based on virtual stereotactic image
CN112971824A (en) * 2021-02-08 2021-06-18 上海联影医疗科技股份有限公司 PET dynamic image scanning method, device and computer equipment

Similar Documents

Publication Publication Date Title
CN107392897B (en) Organ contour acquisition method, imaging apparatus, radiotherapy planning system, and storage medium
CN109567843A (en) A kind of image scanning automatic positioning method, device, equipment and medium
CN110960241A (en) Method and device for determining scanning parameters of medical image scanning and computer equipment
CN106725570A (en) Imaging method and system
US10593041B1 (en) Methods and apparatus for the application of machine learning to radiographic images of animals
CN111493909A (en) Medical image scanning method, apparatus, computer device and storage medium
CN111904379B (en) Scanning method and device for multi-mode medical equipment
CN112150574A (en) Method, system and device for automatically correcting image artifacts and storage medium
CN111493908A (en) Medical image scanning method, apparatus, computer device and storage medium
CN109924993A (en) Image scanning agreement automatic planning, device, electronic equipment and storage medium
CN212037549U (en) Medical imaging system
CN107798711B (en) Medical imaging scanning method and system
CN107252353A (en) The control method and medical imaging devices of medical imaging devices
CN114943714A (en) Medical image processing system, medical image processing apparatus, electronic device, and storage medium
CN112037147A (en) Medical image noise reduction method and device
CN111462139A (en) Medical image display method, medical image display device, computer equipment and readable storage medium
CN114299019A (en) Scanning method, system and device for nuclear medicine equipment
CN107256344A (en) Data processing method, device and radiotherapy management system
CN111402356A (en) Parameter imaging input function extraction method and device and computer equipment
CN114202516A (en) Foreign matter detection method and device, electronic equipment and storage medium
CN113344926A (en) Method, device, server and storage medium for recognizing biliary-pancreatic ultrasonic image
CN113989231A (en) Method and device for determining kinetic parameters, computer equipment and storage medium
CN106650734A (en) Method for identifying sub areas of locating image, method and device for displaying medical images
CN110301928B (en) Method, device and system for reconstructing PET (positron emission tomography) image
CN113520443A (en) PET system parameter recommendation method and PET system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination