CN113971754A - Image data acquisition method and device, computing equipment and storage medium

Image data acquisition method and device, computing equipment and storage medium

Info

Publication number
CN113971754A
Authority
CN
China
Prior art keywords
image data
scan
feature
determining
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111250313.9A
Other languages
Chinese (zh)
Inventor
毛新生 (Mao Xinsheng)
郑超 (Zheng Chao)
阳光 (Yang Guang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shukun Beijing Network Technology Co Ltd
Original Assignee
Shukun Beijing Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shukun Beijing Network Technology Co Ltd filed Critical Shukun Beijing Network Technology Co Ltd
Priority to CN202111250313.9A
Publication of CN113971754A

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/0033 Features or image-related aspects of imaging apparatus classified in A61B 5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
            • A61B 5/0037 Performing a preliminary scan, e.g. a prescan for identifying a region of interest
            • A61B 5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
              • A61B 5/055 involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
          • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
            • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
              • A61B 6/025 Tomosynthesis
              • A61B 6/03 Computed tomography [CT]
                • A61B 6/032 Transmission computed tomography [CT]
                • A61B 6/037 Emission tomography
            • A61B 6/52 Devices using data or image processing specially adapted for radiation diagnosis
              • A61B 6/5211 involving processing of medical diagnostic data
                • A61B 6/5229 combining image data of a patient, e.g. combining a functional image with an anatomical image
            • A61B 6/58 Testing, adjusting or calibrating thereof
          • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
            • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
              • A61B 8/5215 involving processing of medical diagnostic data
                • A61B 8/5238 for combining image data of patient, e.g. merging several images from different acquisition modes into one image
            • A61B 8/58 Testing, adjusting or calibrating the diagnostic device
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00 Image analysis
            • G06T 7/0002 Inspection of images, e.g. flaw detection
              • G06T 7/0012 Biomedical image inspection
          • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/30 Subject of image; Context of image processing
              • G06T 2207/30168 Image quality inspection
      • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H 40/60 for the operation of medical equipment or devices

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Quality & Reliability (AREA)
  • Pulmonology (AREA)
  • General Physics & Mathematics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Provided are an image data acquisition method, an image data acquisition device, a computing device and a storage medium. The method can comprise the following steps: acquiring first image data according to first scanning parameters determined based on scanning configuration data from a user; in response to determining that the first image data meets the secondary acquisition condition, acquiring second image data according to the second scanning parameters; and generating final image data based on at least the second image data.

Description

Image data acquisition method and device, computing equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image data acquisition method, an image data acquisition device, a computing device, and a storage medium.
Background
In the field of image acquisition, particularly medical image scanning, the situation that image data acquisition fails and needs to be acquired again due to poor image quality or failure in clear description of target features to be acquired often occurs. For this reason, it is desirable to obtain a method capable of automatically supplementing acquired data to obtain more accurate image data.
Disclosure of Invention
According to an aspect of the present disclosure, there is provided an image data acquisition method including: acquiring first image data according to first scanning parameters determined based on scanning configuration data from a user; in response to determining that the first image data meets a secondary acquisition condition, acquiring second image data according to a second scanning parameter; and generating final image data based on at least the second image data.
According to another aspect of the present disclosure, there is provided an image data acquisition apparatus including: a first acquisition unit configured to acquire first image data according to a first scanning parameter determined based on scanning configuration data from a user; the second acquisition unit is used for responding to the fact that the first image data meet the secondary acquisition condition and acquiring second image data according to second scanning parameters; and a result generation unit for generating final image data based on at least the second image data.
According to another aspect of the present disclosure, there is provided a computing device comprising: a memory, a processor, and a computer program stored on the memory, wherein the processor is configured to execute the computer program to implement the steps of a method according to an embodiment of the disclosure.
According to yet another aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of a method according to an embodiment of the present disclosure.
According to yet another aspect of the present disclosure, a computer program product is provided, comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of a method according to an embodiment of the present disclosure.
These and other aspects of the disclosure will be apparent from and elucidated with reference to the embodiments described hereinafter.
Drawings
Further details, features and advantages of the disclosure are disclosed in the following description of exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram illustrating an example system in which various methods described herein may be implemented, according to an example embodiment;
FIG. 2 is a flow chart illustrating an image data acquisition method according to an exemplary embodiment;
FIG. 3 is a flow chart illustrating an image data acquisition method according to a further exemplary embodiment;
FIG. 4 is a schematic block diagram illustrating an image data acquisition apparatus according to an exemplary embodiment;
FIG. 5 is a block diagram illustrating an exemplary computer device that can be applied to the exemplary embodiments.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, the elements may be one or more. As used herein, the term "plurality" means two or more, and the term "based on" should be interpreted as "based, at least in part, on". Further, the terms "and/or" and "at least one of ..." encompass any and all possible combinations of the listed items.
Exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram illustrating an example system 100 in which various methods described herein may be implemented, according to an example embodiment.
Referring to fig. 1, the system 100 includes a client device 110, a server 120, and a network 130 communicatively coupling the client device 110 and the server 120.
The client device 110 includes a display 114 and a client Application (APP) 112 displayable via the display 114. The client application 112 may be an application that needs to be downloaded and installed before running, or an applet (lite app), i.e., a lightweight application. In the case where the client application 112 is an application program that needs to be downloaded and installed before running, the client application 112 may be installed on the client device 110 in advance and activated. In the case where the client application 112 is an applet, the user 102 can run the client application 112 directly on the client device 110, without installing it, by searching for the client application 112 in a host application (e.g., by the name of the client application 112) or by scanning a graphical code (e.g., a barcode or two-dimensional code) of the client application 112. In some embodiments, the client device 110 may be any type of mobile computer device, including a mobile computer, a mobile phone, a wearable computer device (e.g., a smart watch or a head-mounted device, including smart glasses), or another type of mobile device. In some embodiments, the client device 110 may alternatively be a stationary computer device, such as a desktop computer, a server computer, or another type of stationary computer device. In some optional embodiments, the client device 110 may also be or include a medical image printing device.
The server 120 is typically a server deployed by an Internet Service Provider (ISP) or Internet Content Provider (ICP). Server 120 may represent a single server, a cluster of multiple servers, a distributed system, or a cloud server providing an underlying cloud service (such as cloud database, cloud computing, cloud storage, cloud communications). It will be understood that although the server 120 is shown in fig. 1 as communicating with only one client device 110, the server 120 may provide background services for multiple client devices simultaneously.
Examples of network 130 include a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), and/or a combination of communication networks such as the Internet. The network 130 may be a wired or wireless network. In some embodiments, data exchanged over network 130 is processed using techniques and/or formats including hypertext markup language (HTML), extensible markup language (XML), and the like. In addition, all or some of the links may also be encrypted using encryption techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), internet protocol security (IPsec), and so on. In some embodiments, custom and/or dedicated data communication techniques may also be used in place of or in addition to the data communication techniques described above.
The system 100 may also include an image acquisition device 140. In some embodiments, the image acquisition device 140 shown in fig. 1 may be a medical scanning device, including but not limited to a Positron Emission Tomography (PET) imaging system, a combined PET and Computed Tomography (PET/CT) imaging system, a combined Single Photon Emission Computed Tomography and Computed Tomography (SPECT/CT) imaging system, a Computed Tomography (CT) system, a medical ultrasound imaging system, a Nuclear Magnetic Resonance Imaging (NMRI) or Magnetic Resonance Imaging (MRI) system, a contrast imaging system, a Digital Radiography (DR) imaging system, and the like. For example, the image acquisition device 140 may include a digital subtraction angiography scanner, a magnetic resonance angiography scanner, a tomography scanner, a positron emission computed tomography scanner, a single photon emission computed tomography scanner, a medical ultrasound examination device, a magnetic resonance imaging scanner, a digital radiography scanner, or the like. The image acquisition device 140 may be coupled to a server (e.g., the server 120 of fig. 1 or a separate server of an imaging system not shown in the figures) to perform processing of image data, including but not limited to conversion of scan data (e.g., into a sequence of medical images), compression, pixel correction, three-dimensional reconstruction, and the like.
The image acquisition device 140 may be connected with the client device 110, for example over the network 130, or may be directly connected to the client device to communicate with it.
Optionally, the system may also include a smart computing device or computing card 150. The image acquisition device 140 may include, or be connected (e.g., removably connected) to, such a computing card 150. As one example, the computing card 150 may implement processing of image data, including but not limited to conversion, compression, pixel correction, reconstruction, and the like. As another example, the computing card 150 may implement an image data acquisition method according to embodiments of the present disclosure.
The system may also include other parts not shown, such as a data store. The data storage part may be one or more devices for data storage in the form of a database, a data repository, or otherwise; it may be a conventional database, and may also include a cloud database, a distributed database, and the like. For example, raw image data formed by the image acquisition device 140, or medical image sequences or three-dimensional image data obtained through image processing, may be stored in the data store for later retrieval by the server 120 and the client device 110. In addition, the image acquisition device 140 may also directly provide the image data, or the medical image sequence or three-dimensional image data obtained through image processing, to the server 120 or the client device 110.
The user may use the client device 110 to view captured images or videos (including preliminary image data or analyzed images), view analysis results, interact with the captured images or analysis results, and input acquisition instructions, configuration data, and so forth. The client device 110 may send configuration data, instructions, or other information to the image acquisition device 140 to control the acquisition and data processing of the image acquisition device.
FIG. 2 is a flowchart illustrating an image data acquisition method 200 according to an exemplary embodiment. The method 200 may be performed at a client device (e.g., the client device 110 shown in fig. 1), i.e., the subject executing the various steps of the method 200 may be the client device 110 shown in fig. 1. In some embodiments, the method 200 may be performed at a server (e.g., the server 120 shown in fig. 1). In some embodiments, the method 200 may be performed on devices other than the client device 110 (e.g., the image acquisition device 140 or the computing card 150). In some embodiments, the method 200 may be performed by a client device (e.g., the client device 110 or other devices 140, 150, etc.) in combination with a server (e.g., the server 120).
Hereinafter, the steps of the method 200 are described in detail by taking the execution subject as the client device 110 as an example.
Referring to fig. 2, at step 210, first image data is acquired according to first scan parameters determined based on scan configuration data from a user.
At step 220, in response to determining that the first image data satisfies the secondary acquisition condition, second image data is acquired according to the second scan parameters.
At step 230, final image data is generated based at least on the second image data.
In this way, the acquired data can be automatically analyzed and supplemented, and more accurate image data can be obtained based on the complementarily acquired image.
With the method 200, upon determining that the first image data satisfies the secondary acquisition condition, the acquisition target can be re-acquired using the second scan parameters to obtain the second image data; and an acquisition result is generated based on at least the second image data.
The second scan parameter may, for example, be a scan parameter generated based on the first scan parameter: for instance, a scan parameter generated for a certain region of the first image data in response to determining that the region is problematic, or an acquisition angle parameter generated for a certain region or target feature of the first image data in response to determining that a certain feature is present in that region. The second scan parameter may also be the same as the first scan parameter, e.g., for a simple repeated acquisition. Alternatively, the second scan parameter may be a predetermined parameter: for example, once certain features are detected, a re-acquisition using another predetermined angle or predetermined angle range and/or another predetermined resolution may be triggered. It is to be understood that the above are examples and the disclosure is not limited thereto.
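To make the control flow concrete, the following Python sketch strings the three steps of method 200 together. This is a minimal sketch under stated assumptions: all helper names (scan, pre_analyse, plan_secondary_scan, merge) are hypothetical stand-ins for the scanner interface and analysis models, and the disclosure does not prescribe this decomposition.

```python
import numpy as np

def scan(params: dict) -> np.ndarray:
    """Hypothetical stand-in for the image acquisition device 140."""
    return np.random.default_rng(0).random((64, 64))

def pre_analyse(image: np.ndarray):
    """Hypothetical low-latency pre-analysis; returns a verdict dict when
    the secondary acquisition condition is met, otherwise None."""
    return {"kind": "target_feature", "bbox": (10, 10, 24, 24)}

def plan_secondary_scan(first_params: dict, verdict: dict) -> dict:
    """Derive second scan parameters; here simply narrow the scan interval
    to the detected region (other strategies are described in the text)."""
    second = dict(first_params)
    second["interval"] = verdict["bbox"]
    return second

def merge(first: np.ndarray, second: np.ndarray, verdict: dict) -> np.ndarray:
    """Naive local replacement; the superposition and reconstruction
    strategies discussed below would plug in here."""
    x0, y0, x1, y1 = verdict["bbox"]
    out = first.copy()
    out[y0:y1, x0:x1] = second[: y1 - y0, : x1 - x0]
    return out

def acquire(first_params: dict) -> np.ndarray:
    first_image = scan(first_params)                      # step 210
    verdict = pre_analyse(first_image)
    if verdict is not None:                               # step 220
        second_image = scan(plan_secondary_scan(first_params, verdict))
        return merge(first_image, second_image, verdict)  # step 230
    return first_image

final_image = acquire({"interval": (0, 0, 64, 64), "resolution": 1.0})
```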
Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results.
According to some embodiments, determining that the first image data satisfies the secondary acquisition condition may include: determining that the first image data includes a target feature corresponding to the scan configuration data, and generating the final image data based at least on the second image data may include: final image data is obtained based on the first image data and the second image data.
In such an embodiment, the secondary acquisition condition may include the presence of a target feature, such as an object to be detected or a lesion feature, and the secondary acquisition may therefore sample the target feature more finely (or at the same resolution), with the results of the two acquisitions combined. In this way, more accurate data (for example, data with more pixel points or more channels) can be generated for the target feature so as to produce more accurate and effective image data, reducing the time a human reviewer may need to judge the target feature, while avoiding consuming scanning resources where (or in regions where) no potential target feature exists.
According to some embodiments, the method further comprises generating the second scan parameter in response to determining that the first image data includes the target feature corresponding to the scan configuration data. The second scan parameter may include a scan interval corresponding to the target feature. For example, the first scan parameter may include a first scan interval, and the generated (second) scan interval may be smaller than the first scan interval, e.g., covering only a portion of the range corresponding to the target feature, such as pixel points in and around the region where the target feature is located.
In further embodiments, the second scan parameters may also include a scan order, scan direction, etc. that differ wholly or partially from the first scan parameters. For example, in ultrasound scanning, where the first scan parameter specifies a frontal scan, the second scan parameter may comprise a lateral scan in a different direction. The second scan parameters may also involve different or additional sensors, more scanning channels, and the like.
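One way to generate such second scan parameters from a detected target feature, per the embodiments above, is sketched below: a scan interval reduced to the feature region plus a margin, and optionally a changed scan direction. The field names (interval, direction) and the margin default are illustrative assumptions, not taken from the patent.

```python
def second_params_for_feature(first_params: dict, bbox: tuple,
                              margin: int = 8) -> dict:
    """Build second scan parameters covering only the target feature region."""
    x0, y0, x1, y1 = bbox
    fx0, fy0, fx1, fy1 = first_params["interval"]
    return {
        **first_params,
        # a second scan interval smaller than the first one, clipped to it
        "interval": (max(fx0, x0 - margin), max(fy0, y0 - margin),
                     min(fx1, x1 + margin), min(fy1, y1 + margin)),
        # optionally a different scan order/direction, e.g. a lateral
        # instead of a frontal ultrasound scan
        "direction": "lateral",
    }

params = second_params_for_feature(
    {"interval": (0, 0, 64, 64), "direction": "frontal"},
    bbox=(10, 10, 24, 24))
```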
According to some embodiments, the scan configuration data includes an analysis policy, and determining that the first image data has a target feature corresponding to the scan configuration data may include: determining at least one feature to be detected based on the analysis strategy; and determining that a target feature matching one of the at least one feature to be detected exists in the image data. With such an embodiment, the image can be parsed according to the analysis strategy in the configuration data, matched against candidate lesion features, and the features required by the analysis strategy can be finely sampled. For example, the feature to be detected may be a feature corresponding to a blood vessel or plaque. In response to the analysis determining the presence of plaque (or suspected plaque), shots may be taken from different angles to obtain more comprehensive data about possible lesion features for reconstruction, measurement, and the like.
According to some embodiments, the method 200 may further include, after generating the final image data, performing an analysis process on the final image data based on the analysis policy to obtain an image analysis result. Determining that a target feature matching one of the at least one feature to be detected exists in the image data may be done by performing a pre-analysis process on the first image data. The pre-analysis process may adopt lower accuracy requirements than the analysis process, e.g., a lower lesion detection rate requirement, thus enabling faster analysis of the lesion area and a secondary scan in near real time. For example, the patient does not need to wait long at the scanning instrument, nor is a necessary rescan discovered only after the patient has left; instead, the secondary scan can be performed almost immediately.
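This accuracy trade-off can be as simple as two operating points on one detector: a permissive threshold for the near-real-time pre-analysis and a stricter one for the final analysis. The detector, its candidate outputs, and the threshold values below are invented for illustration only.

```python
PRE_THRESHOLD = 0.3    # permissive: rescanning is cheap, missing a lesion is not
FINAL_THRESHOLD = 0.7  # strict: drives the reported analysis result

def detect(image, threshold: float):
    """Hypothetical lesion detector returning (score, bbox) candidates."""
    candidates = [(0.42, (10, 10, 24, 24)), (0.81, (30, 40, 44, 52))]
    return [c for c in candidates if c[0] >= threshold]

def needs_secondary_scan(first_image) -> bool:
    # pre-analysis: any candidate above the low threshold triggers a rescan
    return bool(detect(first_image, PRE_THRESHOLD))

def analyse_final(final_image):
    # final analysis of the final image data, at the stricter operating point
    return detect(final_image, FINAL_THRESHOLD)
```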
According to some further embodiments, the final image data may have a resolution higher than the resolution of the first image data, at least in the vicinity of the target feature. According to some embodiments, generating the final image data may include generating coordinate correction data for at least one of the first image data and the second image data based on a first feature in the first image data and a second feature in the second image data corresponding to the first feature; and superimposing the first image data and the second image data based on the coordinate correction data to obtain the final image data. In such a case, the (practically ubiquitous) systematic offset, or offset of the target object, can be exploited to increase resolution, so that the data of the two acquisitions can be combined without greatly increasing (or even without increasing) the precision or resolution of the secondary acquisition, obtaining higher resolution and presenting the target feature (e.g., a lesion target or lesion region) more clearly. For example, the coordinate correction data may be data indicating that each data point of the first image (or the second image; this is only an example) should be moved by (+1, -2) to align with the second image. The coordinate correction data may be an individual correction value for each pixel point, or a single correction value or correction function (e.g., a one-to-one mapping) for the entire image. It is also possible to compare the first image and the second image, respectively, with a predetermined reference, and thus generate one piece of correction data for each image.
Without changing the acquisition resolution of the first and second images (assume both have length l), and as a very simplified, merely illustrative example, suppose the first image comprises four sample points, denoted A1(0,0), A2(0,l), A3(l,0) and A4(l,l) with respect to its acquisition coordinate system; and the second image also includes four sample points, denoted B1(0,0), B2(0,l), B3(l,0) and B4(l,l) with respect to its acquisition coordinate system. Through coordinate correction analysis, it is found that the second image has a global translation (m,n) relative to the first image, due to acquisition errors, scanning errors, or patient movement. If the coordinates of the first image are mapped to the second image, its four data points can be denoted A1'(m,n), A2'(m,l+n), A3'(l+m,n) and A4'(l+m,l+n). Thereby, eight pixel points A1'-A4' and B1-B4 are obtained under the coordinate reference of the second image, and a finer resolution can be achieved. This effect can be obtained, particularly advantageously, without adjusting the acquisition resolution. It will be appreciated that the above is merely an example: each image may comprise many more data points, the coordinates of the second image may instead be mapped to the first image (or both to another reference image), and the coordinate correction relationship may comprise not only translation but also rotation, etc. For example, one or more feature points (e.g., clear anatomical features, obvious body parts, or certain specific patterns or light spots) can be selected in the two images, and the positional relationship between the two coordinate systems can be calculated or estimated from the positions of these feature points in each image. It is to be understood that the present disclosure is not limited thereto.
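The translation example can be run numerically: after estimating the global offset (m, n) from matched feature points, the first image's sample coordinates are expressed in the second image's frame, and the pooled point set has twice the sample density of either acquisition. This is a toy sketch only; the feature-point coordinates are invented, and a real registration would also handle rotation and local deformation.

```python
import numpy as np

l = 10.0                                                 # acquisition grid length
A = np.array([(0, 0), (0, l), (l, 0), (l, l)], float)    # A1..A4 (first image)
B = np.array([(0, 0), (0, l), (l, 0), (l, l)], float)    # B1..B4 (second image)

# estimate (m, n) from matched feature points found in both images
feats_in_first = np.array([(2.0, 2.0), (7.0, 5.0)])
feats_in_second = np.array([(5.0, 0.0), (10.0, 3.0)])    # shifted by (3, -2)
m, n = (feats_in_second - feats_in_first).mean(axis=0)

A_in_B_frame = A + np.array([m, n])                      # A1'..A4'
pooled = np.vstack([A_in_B_frame, B])                    # eight distinct samples
print(pooled)                                            # finer effective resolution
```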
According to some embodiments, the second scan parameter further comprises a scan precision greater than the scan precision indicated in the first scan parameter. For example, enhanced resolution for the target feature may be included, including global resolution or local resolution, or increased sensor sensitivity, including global sensor sensitivity or local sensor sensitivity. Therefore, the target characteristics can be scanned more finely and specifically.
According to some embodiments, the second scan parameter may include at least one scan angle different from the scan angle included in the first scan parameter, and the final image data may include three-dimensional reconstruction data for the target feature. The at least one angle may be one or more discrete scan angles or may be a range of scan angles. Thereby, a multi-angle acquisition of target features, such as lesion areas or lesions, can be achieved.
According to some embodiments, determining that the first image data satisfies the secondary acquisition condition may include: determining that a defect feature is present in the first image data; and generating final image data based at least on the second image data may include: generating the final image data by replacing at least a portion of the first image data with the second image data. In such an embodiment, when the image has a defect, re-shooting can be performed, followed by full or partial replacement, to obtain image data free of defects or with the defects alleviated. In such embodiments, determining the presence of a defect feature in the first image data may include pre-analyzing or pre-reconstructing the image data in terms of image quality, and the pre-analysis and pre-reconstruction may adopt lower precision or accuracy requirements than the actual analysis and reconstruction process. For example, a quality evaluation model may be employed to evaluate the image reconstruction quality; the quality evaluation model may be trained in advance to rate partially defective or blurred images as being of poor reconstruction quality. The reconstruction quality may also cover image integrity or distortion. As one example, integrity may be evaluated by extracting a centerline, a particular line, and/or a particular region and checking its completeness. As another example, keypoints in the image, such as key body parts or features, may be referenced, and an incomplete image or incomplete scan range may be determined based on the absence of keypoints. Determining the presence of defect features in the first image data may additionally or alternatively include evaluating image accuracy, including but not limited to analyzing image size and target edge contrast to determine whether the image accuracy meets requirements. It will be appreciated that the above are merely examples, and that the disclosure is not limited thereto.
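One of the completeness checks above, inferring an incomplete scan range from missing keypoints, might look like the following sketch. The keypoint detector (a trained model in the text) is stubbed out, and the landmark names are placeholders, not from the patent.

```python
EXPECTED_KEYPOINTS = {"apex", "carina", "diaphragm_left", "diaphragm_right"}

def detect_keypoints(image) -> set:
    """Hypothetical pre-trained landmark detector."""
    return {"apex", "carina", "diaphragm_left"}

def defect_from_keypoints(image):
    """Report a local defect feature when expected landmarks are absent."""
    missing = EXPECTED_KEYPOINTS - detect_keypoints(image)
    if missing:
        return {"kind": "defect", "scope": "local", "missing": sorted(missing)}
    return None   # no completeness defect found
```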
According to some embodiments, the defect features may include global-type defect features and local-type defect features, and the second scan parameter includes a scan interval corresponding to the defect feature. For example, for artifact defects or overall blurring, the first image data may need to be completely replaced with the second image data; for defects such as locally missing data or local blur, a local replacement may be performed on the first image data.
According to some embodiments, the method 200 may further include controlling the terminal to output a warning and stop generating the final image data in response to determining that a defect feature is present in the second image data. In this way, when a defect that secondary acquisition cannot overcome is found, for example a blurred or unexpected pattern in a certain area, the defect may be presumed to stem from static interference, such as undetected metal on the patient, or from another system error; prompting manual intervention then allows interference features to be avoided, more accurate images to be generated, and computing resources to be saved.
According to some embodiments, the method 200 may further comprise: in response to determining that the defect feature satisfies a human intervention criterion, controlling the terminal to output a warning and suspend acquiring the second image data; and in response to receiving an instruction from the user, performing the step of acquiring the second image data. The human intervention criteria may cover common failure morphologies that secondary acquisition cannot, or can only with difficulty, overcome: for example, a contrast-enhanced target of unexpected morphology may indicate a contrast agent timing or dose mismatch requiring the contrast agent to be re-applied, and certain features may indicate the presence of external disturbances, such as a scattering-like highlight pattern indicating the presence of metal objects.
An image data acquisition method 300 according to a further embodiment of the present disclosure is described below in connection with fig. 3.
At step 310, acquired image data is obtained, such as first image data acquired with first scan parameters. The first scan parameters may be determined based on scan configuration data from a user. Determining the first scan parameters based on the scan configuration data may include having the user configure all or most of the scan parameters, such as scan type, scan interval or position range, scan angle or scan angle range, scan time or duration, resolution or accuracy, and so forth. Determining the first scan parameters based on the scan configuration data may also include automatically generating one or more scan parameters (such as scan type, scan angles, scan intervals, and scan accuracy) from indicative information configured by the user for the scan or the analysis strategy: for example, the configuration may indicate that the scan object is a lung region, that the analysis target is some possible condition or lesion such as pneumonia, and that the analysis strategy is to determine lesions based on certain specific signs, such as scatter-pattern signs. It is to be understood that the present disclosure is not limited thereto.
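In the second mode, where the user supplies only indicative configuration, the first scan parameters could come from a protocol table keyed by scan object and analysis target, with explicit user settings taking precedence. A minimal sketch; the table contents and field names are invented for illustration.

```python
PROTOCOLS = {
    ("lung", "pneumonia"): {"scan_type": "CT", "interval": (0, 0, 512, 512),
                            "angles": [0.0], "resolution": 1.0},
}

def first_params_from_config(config: dict) -> dict:
    params = dict(PROTOCOLS[(config["scan_object"], config["analysis_target"])])
    params.update(config.get("overrides", {}))   # explicit user settings win
    return params

params = first_params_from_config(
    {"scan_object": "lung", "analysis_target": "pneumonia",
     "overrides": {"resolution": 0.5}})
```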
At step 320, a preliminary parsing process is performed on the image data. The preliminary analysis, or pre-analysis, may cover both lesion features and image quality, and may adopt lower accuracy requirements, such as lower detection rate and reconstruction rate requirements, to reduce computation time and hence latency, achieving a balance between timeliness and accuracy. In an alternative embodiment, where a lesion feature such as a lesion is detected, the pre-analysis may further comprise analyzing the lesion extent, size, etc., which may be achieved, for example, using a pre-trained lesion detection and segmentation model.
At step 330, second scan parameters are generated based on the pre-analysis results. Step 330 may include one or both of: making a determination based on the analysis strategy or analysis objective, and determining whether the image quality is questionable. The determination based on the analysis strategy or objective may include determining whether features matching expected lesion features are present in the acquired image data. The determination in terms of image quality may include whether the image has artifacts, missing regions, blur, or interference patterns, fails to meet reconstruction criteria, or has other quality issues. In some alternative embodiments, the two kinds of analysis and determination may be performed in combination. For example, in some embodiments, a lesion region may be identified by parsing based on the analysis strategy, and then a pre-trained artifact classification model may be used to identify lesion artifacts, or the image quality of the lesion region may be further analyzed.
Generating the second scan parameters based on the pre-analysis results may include generating and merging secondary acquisition parameters for image quality issues and for analysis targets, respectively; alternatively, it may include generating secondary acquisition parameters from the combined analysis results. For example, continuing the example above, parameters for increasing resolution, supplementing missing portions, or correcting artifacts may be generated for a lesion region, based on the analysis finding lesion artifacts or poor image quality in that region. The secondary acquisition may be global or local; data from a secondary acquisition addressing image quality problems may replace the previous portions of questionable quality, while data from a secondary acquisition driven by the analysis strategy or analysis target may be combined or merged with the first image data.
Other relationships between image quality issues and the associated secondary scan parameters have been described above, or may be other parameter adjustments as will occur to those of skill in the art. For example, when an image has artifacts, parameters similar or identical to the first scan parameters may be generated for the secondary acquisition (e.g., based on the same parameters by default, or when the artifact morphology indicates a problem with the acquired object, such as artifacts induced by object movement), or some or all of the first scan parameters may be adjusted for the secondary acquisition according to the artifact morphology (e.g., when the artifact morphology indicates artifacts induced by inappropriate system parameters). Similarly, when the image has a missing region or another local defect, parameters similar or identical to the first scan parameters may be generated for the secondary acquisition, or some or all of the first scan parameters may be adjusted according to the characteristics of the missing region or defect. It is to be understood that the present disclosure is not limited thereto, and those skilled in the art will appreciate that other ways of generating scan parameters based on image quality issues are also applicable to the methods of the embodiments of the present disclosure. Furthermore, as mentioned above, certain specific defects, such as an unexpected morphology of the contrast-enhanced target or the presence of a feature matching an interfering signal, may indicate an incorrect contrast agent timing or the presence of an external interfering object; in that case a warning may be output to prompt human intervention.
Similarly, the manner in which parameters are generated based on target features has been described above, or other parameter adjustments will occur to those of skill in the art. For example, when the image indicates the presence of a suspected detection object or target lesion feature in a certain region, parameters similar or identical to the first scan parameters may be generated for the secondary acquisition (e.g., based on the same parameters by default, or on substantially the same or similar parameters when the feature morphology indicates a target feature suitable for two-dimensional viewing without three-dimensional reconstruction). As another example, parameters with increased resolution or sensor sensitivity compared to the first scan parameters (globally or locally) may be generated for the secondary acquisition (e.g., the second scan parameters adopt increased resolution or sensitivity by default, or when the feature morphology indicates target features that require especially refined observation). As yet another example, parameters may be generated that have one or more different angles, a wider angle range, or a larger scan position interval than the first scan parameters (e.g., angles or intervals adjusted by default to enable more comprehensive acquisition of the target feature and its surroundings, or when the feature morphology indicates target features that require particularly refined viewing or typically require three-dimensional reconstruction). Alternatively, for a local acquisition, the second scan parameters may comprise only a small scan position interval so as to re-acquire only the region related to the target feature. It will be appreciated that the above examples may be combined with each other; as one non-limiting example, based on the analysis of a particular target feature, secondary parameters may be generated that have increased local sensitivity and substantially the same scan range as the first scan parameters. It is to be understood that the present disclosure is not limited thereto, and those skilled in the art will appreciate that other ways of generating scan parameters based on image features are also applicable to the methods of the embodiments of the present disclosure.
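Step 330 can thus be read as a dispatch over the pre-analysis verdicts, with quality problems and target features each contributing adjustments that are merged into one set of second scan parameters. The rules below only echo the examples given above and are not exhaustive; the verdict dictionary layout is an assumption of this sketch.

```python
def plan_from_verdicts(first_params: dict, verdicts: list) -> dict:
    plan = dict(first_params)
    for v in verdicts:
        if v["kind"] == "defect" and v.get("scope") == "global":
            pass                              # re-acquire with the same params
        elif v["kind"] == "defect":
            plan["interval"] = v["bbox"]      # local defect: rescan that region
        elif v["kind"] == "target_feature":
            plan["resolution"] = plan.get("resolution", 1.0) / 2  # finer sampling
            plan["angles"] = [-15.0, 0.0, 15.0]                   # wider angle coverage
    return plan

second_params = plan_from_verdicts(
    {"interval": (0, 0, 64, 64), "resolution": 1.0},
    [{"kind": "defect", "scope": "local", "bbox": (0, 0, 16, 16)},
     {"kind": "target_feature", "bbox": (30, 30, 44, 44)}])
```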
At step 340, a secondary acquisition is performed based on the parameters generated in the previous step, i.e., the second scan parameters, to obtain second image data. It will be appreciated that although this is referred to as a "secondary acquisition," it does not mean that only a single acquisition procedure (e.g., one in which the scanning probe is translated once for a scan) is performed. On the contrary, the secondary acquisition based on the second scan parameters may generally comprise a single scan, or may comprise multiple scans. As one non-limiting example, where the pre-analysis indicates both a missing portion in the image and a target feature requiring fine scanning, such a secondary acquisition may include multiple back-and-forth scans of the target at varying angles, with the previously missing image data additionally acquired during those scans; alternatively, the secondary acquisition may include multiple back-and-forth scans of the target at varying angles plus another scan performed to make up for the missing or blurred image portions. It will be appreciated that the above is merely an example and that the disclosure is not so limited.
At step 350, final image data may be generated based on the second image data obtained in step 340 and optionally based on the first image data. Various methods of generating image data, including superposition, combination, partial or total replacement, three-dimensional reconstruction, etc., have been described in the foregoing, and will not be described in detail here. It will be appreciated that the final image data may be two-dimensional image data, or may be or include three-dimensional image data. The final image data may be sent directly to the terminal device for use by the user. The final image data may also be subjected to other analysis processes (e.g. final analysis process of whether a lesion feature or lesion is present, which may be based on higher accuracy requirements than pre-analysis), and only the analysis results generated by such analysis are sent to the terminal device without the need to send the final image data; alternatively, the analysis result may be transmitted to the terminal device together with the final image data. It is to be understood that the present disclosure is not limited thereto.
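Step 350 likewise dispatches on the verdict kind: replacement for quality defects, and registered superposition (or three-dimensional reconstruction, omitted here) for target features. A simplified sketch; the averaging at the end merely stands in for the coordinate-corrected superposition described earlier, and the second image is assumed to be resampled onto the first image's grid.

```python
import numpy as np

def generate_final(first: np.ndarray, second: np.ndarray, verdict: dict):
    if verdict["kind"] == "defect":
        if verdict.get("scope") == "global":
            return second                    # replace the first image entirely
        x0, y0, x1, y1 = verdict["bbox"]
        out = first.copy()
        out[y0:y1, x0:x1] = second[y0:y1, x0:x1]   # partial replacement
        return out
    # target feature: combine both acquisitions; a real implementation would
    # first apply the coordinate correction data, then superimpose
    return 0.5 * (first + second)
```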
Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results. For example, where the preliminary parsing of the image includes preliminary parsing in terms of image quality and in terms of target features respectively, the parsing in terms of image quality may be performed before, after, or concurrently with the parsing in terms of target features, or even combined with it. As another example, either the preliminary parsing in terms of image quality or the preliminary parsing in terms of target features may be omitted, i.e., the preliminary parsing may comprise an analysis of only one of image quality and target features. It is to be understood that the present disclosure is not limited thereto. Further, it is understood that features or embodiments described with respect to the method 300 may also be applied to (e.g., in addition to or in place of) features or embodiments described with respect to the method 200, and such combinations and substitutions also fall within the scope of the present disclosure.
Fig. 4 is a schematic block diagram illustrating an image data acquisition apparatus 400 according to an exemplary embodiment. The image data acquisition apparatus 400 may include a first acquisition unit 410, a second acquisition unit 420, and a result generation unit 430. The first acquisition unit 410 is configured to acquire first image data according to first scan parameters determined based on scan configuration data from a user. The second obtaining unit 420 is configured to, in response to determining that the first image data satisfies the secondary acquisition condition, obtain second image data according to a second scanning parameter. The result generation unit 430 is configured to generate final image data based on at least the second image data.
It should be understood that the various modules of the apparatus 400 shown in fig. 4 may correspond to the various steps in the method 200 described with reference to fig. 2. Thus, the operations, features and advantages described above with respect to the method 200 are equally applicable to the apparatus 400 and the modules comprised thereby. Certain operations, features and advantages may not be described in detail herein for the sake of brevity.
According to an embodiment of the present disclosure, there is also disclosed a computing device comprising a memory, a processor and a computer program stored on the memory, wherein the processor is configured to execute the computer program to implement the steps of the image data acquisition method according to an embodiment of the present disclosure and its variants.
According to an embodiment of the present disclosure, a non-transitory computer readable storage medium is also disclosed, having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the image data acquisition method and its variants according to an embodiment of the present disclosure.
According to an embodiment of the present disclosure, a computer program product is also disclosed, comprising a computer program, wherein the computer program, when executed by a processor, realizes the steps of the image data acquisition method according to an embodiment of the present disclosure and its variants.
Although specific functionality is discussed above with reference to particular modules, it should be noted that the functionality of the various modules discussed herein may be divided into multiple modules and/or at least some of the functionality of multiple modules may be combined into a single module. Performing an action by a particular module discussed herein includes the particular module itself performing the action, or alternatively the particular module invoking or otherwise accessing another component or module that performs the action (or performs the action in conjunction with the particular module). Thus, a particular module that performs an action can include the particular module that performs the action itself and/or another module that the particular module invokes or otherwise accesses that performs the action. As used herein, the phrase "entity a initiates action B" may refer to entity a issuing instructions to perform action B, but entity a itself does not necessarily perform that action B. For example, the result generation unit "generate final image data" may be that the result generation unit sends an instruction to cause another unit to perform generation of image data, or the like, and the present disclosure is not limited thereto.
It should also be appreciated that various techniques may be described herein in the general context of software, hardware elements, or program modules. The various modules described above with respect to fig. 4 may be implemented in hardware or in hardware combined with software and/or firmware. For example, the modules may be implemented as computer program code/instructions configured to be executed in one or more processors and stored in a computer-readable storage medium. Alternatively, the modules may be implemented as hardware logic/circuitry. For example, in some embodiments, one or more of the modules or units described in embodiments of the present disclosure may be implemented together in a System on Chip (SoC). The SoC may include an integrated circuit chip (which includes one or more of: a processor (e.g., a Central Processing Unit (CPU), microcontroller, microprocessor, Digital Signal Processor (DSP), etc.), memory, one or more communication interfaces, and/or other circuitry), and may optionally execute received program code and/or include embedded firmware to perform functions.
According to an aspect of the disclosure, a computing device is provided that includes a memory, a processor, and a computer program stored on the memory. The processor is configured to execute the computer program to implement the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the steps of any of the method embodiments described above.
According to an aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, performs the steps of any of the method embodiments described above.
Illustrative examples of such computer devices, non-transitory computer-readable storage media, and computer program products are described below in connection with FIG. 5.
Fig. 5 illustrates an example configuration of a computer device 500 that may be used to implement the methods described herein. For example, the server 120 and/or the client device 110 shown in fig. 1 may include an architecture similar to the computer device 500. The image data acquisition device/arrangement described above may also be implemented in whole or at least in part by a computer device 500 or similar device or system.
Computer device 500 may be a variety of different types of devices, such as a server of a service provider, a device associated with a client (e.g., a client device), a system on a chip, and/or any other suitable computer device or computing system. Examples of computer device 500 include, but are not limited to: a desktop computer, a server computer, a notebook or netbook computer, a mobile device (e.g., a tablet, a cellular or other wireless telephone (e.g., a smartphone), a notepad computer, a mobile station), a wearable device (e.g., glasses, a watch), an entertainment device (e.g., an entertainment appliance, a set-top box communicatively coupled to a display device, a gaming console), a television or other display device, an automotive computer, and so forth. Thus, the computer device 500 may range from a full resource device with substantial memory and processor resources (e.g., personal computers, game consoles) to a low-resource device with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles).
The computer device 500 may include at least one processor 502, memory 504, communication interface(s) 506, display device 508, other input/output (I/O) devices 510, and one or more mass storage devices 512, which may be capable of communicating with each other, such as through a system bus 514 or other appropriate connection.
Processor 502 may be a single processing unit or multiple processing units, all of which may include single or multiple computing units or multiple cores. The processor 502 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitry, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 502 can be configured to retrieve and execute computer-readable instructions stored in the memory 504, mass storage device 512, or other computer-readable medium, such as program code for an operating system 516, program code for an application 518, program code for other programs 520, and so forth.
Memory 504 and mass storage device 512 are examples of computer-readable storage media for storing instructions that are executed by processor 502 to implement the various functions described above. By way of example, the memory 504 may generally include both volatile and nonvolatile memory (e.g., RAM, ROM, and the like). In addition, mass storage device 512 may generally include a hard disk drive, solid state drive, removable media, including external and removable drives, memory cards, flash memory, floppy disks, optical disks (e.g., CD, DVD), storage arrays, network attached storage, storage area networks, and the like. Memory 504 and mass storage device 512 may both be referred to herein collectively as memory or computer-readable storage media, and may be non-transitory media capable of storing computer-readable, processor-executable program instructions as computer program code that may be executed by processor 502 as a particular machine configured to implement the operations and functions described in the examples herein.
A number of program modules may be stored on the mass storage device 512. These programs include an operating system 516, one or more application programs 518, other programs 520, and program data 522, and they may be loaded into memory 504 for execution. Examples of such applications or program modules may include, for instance, computer program logic (e.g., computer program code or instructions) for implementing the following components/functions: method 200 and/or method 300 (including any suitable steps of methods 200, 300), and/or additional embodiments described herein.
Although illustrated in fig. 5 as being stored in memory 504 of computer device 500, modules 516, 518, 520, and 522, or portions thereof, may be implemented using any form of computer-readable media that is accessible by computer device 500. As used herein, "computer-readable media" includes at least two types of computer-readable media, namely computer storage media and communication media.
Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information for access by a computer device.
In contrast, communication media may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism. Computer storage media, as defined herein, does not include communication media.
Computer device 500 may also include one or more communication interfaces 506 for exchanging data with other devices, such as over a network or a direct connection, as previously discussed. Such communication interfaces may be one or more of the following: any type of network interface (e.g., a network interface card (NIC)), a wired or wireless interface (such as an IEEE 802.11 wireless LAN (WLAN) interface), a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth™ interface, a Near Field Communication (NFC) interface, and so forth. The communication interface 506 may facilitate communication within a variety of networks and protocol types, including wired networks (e.g., LAN, cable, etc.) and wireless networks (e.g., WLAN, cellular, satellite, etc.), the Internet, and so forth. The communication interface 506 may also provide for communication with external storage (not shown), such as a storage array, network attached storage, or a storage area network.
In some examples, a display device 508, such as a monitor, may be included for displaying information and images to a user. Other I/O devices 510 may be devices that receive various inputs from a user and provide various outputs to the user, and may include touch input devices, gesture input devices, cameras, keyboards, remote controls, mice, printers, audio input/output devices, and so forth.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative and exemplary and not restrictive; the present disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps than those listed and the words "a" or "an" do not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
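By way of illustration only, the overall two-pass flow of the methods described above can be summarized in the following minimal, self-contained sketch. The helper logic here (a brightness-threshold stand-in for the pre-analysis, and simply returning the second acquisition as the final image) is an assumption made for readability, not the disclosure's actual analysis policy or fusion rule.

```python
# Minimal sketch of the two-pass acquisition flow: scan once, re-scan with
# different parameters only if a secondary acquisition condition is met,
# then generate the final image data based at least on the second pass.
import numpy as np

def meets_secondary_condition(image: np.ndarray, threshold: float = 0.99) -> bool:
    # Stand-in for the pre-analysis step; a real system would look for a
    # target feature or a defect feature, not a raw brightness threshold.
    return bool((image > threshold).any())

def scan(rng: np.random.Generator, accuracy: int) -> np.ndarray:
    # Stand-in for driving the scanner; resolution grows with scan accuracy.
    return rng.random((accuracy, accuracy))

def acquire_image_data(rng: np.random.Generator) -> np.ndarray:
    first = scan(rng, accuracy=64)            # first scan parameter
    if not meets_secondary_condition(first):
        return first                          # a single pass suffices
    second = scan(rng, accuracy=256)          # higher scan accuracy
    return second                             # final image based on second pass

final_image = acquire_image_data(np.random.default_rng(seed=0))
print(final_image.shape)
```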

Claims (16)

1. An image data acquisition method comprising:
acquiring first image data according to a first scan parameter, the first scan parameter being determined based on scan configuration data from a user;
in response to determining that the first image data meets a secondary acquisition condition, acquiring second image data according to a second scan parameter; and
generating final image data based at least on the second image data.
2. The method of claim 1,
wherein determining that the first image data meets the secondary acquisition condition comprises: determining that the first image data includes a target feature corresponding to the scan configuration data, and
wherein generating final image data based at least on the second image data comprises: obtaining the final image data based on the first image data and the second image data.
3. The method of claim 2, further comprising generating the second scan parameter in response to determining that the first image data includes the target feature corresponding to the scan configuration data, wherein the second scan parameter includes a scan interval corresponding to the target feature.
4. The method of claim 2, wherein the scan configuration data comprises an analysis policy, and wherein determining that the first image data includes the target feature corresponding to the scan configuration data comprises:
determining at least one feature to be detected based on the analysis policy; and
determining that the first image data includes the target feature matching a feature to be detected among the at least one feature to be detected.
5. The method of claim 4, further comprising, after generating the final image data, performing an analysis process on the final image data based on the analysis policy to obtain an image analysis result, wherein the presence in the first image data of the target feature matching a feature to be detected among the at least one feature to be detected is determined by performing a pre-analysis process on the first image data, and wherein the pre-analysis process is based on a lower accuracy requirement than the analysis process.
6. The method of any of claims 1-5, wherein generating the final image data comprises:
generating coordinate correction data for at least one of the first image data and the second image data based on a first feature in the first image data and a second feature in the second image data corresponding to the first feature; and
superposing the first image data and the second image data based on the coordinate correction data to obtain the final image data.
7. The method of any of claims 1-5, wherein the second scan parameter includes a scan accuracy that is greater than a scan accuracy included in the first scan parameter.
8. The method of any of claims 2-5, wherein the second scan parameter includes at least one scan angle that is different from a scan angle included in the first scan parameter, and wherein the final image data includes three-dimensional reconstruction data for the target feature.
9. The method of any of claims 1-5, wherein the secondary acquisition condition includes the presence of a defect feature in the first image data, and
wherein generating final image data based at least on the second image data comprises: generating the final image data by replacing at least a portion of the first image data with the second image data.
10. The method of claim 9, wherein the defect feature comprises a global-type defect feature or a local-type defect feature, and wherein the second scan parameter comprises a scan interval corresponding to the defect feature.
11. The method of claim 9, further comprising controlling a terminal to output a warning and stopping generation of the final image data in response to determining that the defect feature is present in the second image data.
12. The method of claim 9, further comprising:
in response to determining that the defect feature satisfies a human intervention criterion, controlling a terminal to output a warning and suspending acquisition of the second image data; and
performing the step of acquiring the second image data in response to receiving an instruction from the user.
13. An image data acquisition apparatus comprising:
a first acquisition unit configured to acquire first image data according to a first scan parameter, the first scan parameter being determined based on scan configuration data from a user;
a second acquisition unit configured to acquire second image data according to a second scan parameter in response to determining that the first image data meets a secondary acquisition condition; and
a result generation unit configured to generate final image data based at least on the second image data.
14. A computing device, comprising:
a memory, a processor, and a computer program stored on the memory,
wherein the processor is configured to execute the computer program to implement the steps of the method of any one of claims 1-12.
15. A non-transitory computer readable storage medium having a computer program stored thereon, wherein the computer program when executed by a processor implements the steps of the method of any of claims 1-12.
16. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1-12.
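To make the coordinate correction and superposition of claim 6 concrete, the following numpy sketch assumes the corresponding first and second features have already been located as (row, column) pixel coordinates and that a pure translation suffices as the correction; both assumptions, and all names, are illustrative only and not the disclosure's method.

```python
# Illustrative sketch of claim 6: derive coordinate correction data from a
# feature common to both acquisitions, then superpose the two images.
# Assumes 2-D single-channel arrays and a pure-translation correction.
import numpy as np

def coordinate_correction(first_feature, second_feature):
    """Offset that maps the second image's feature onto the first image's
    corresponding feature, as (row shift, column shift)."""
    return (first_feature[0] - second_feature[0],
            first_feature[1] - second_feature[1])

def superpose(first: np.ndarray, second: np.ndarray, shift) -> np.ndarray:
    """Overlay `second` onto `first` after translating it by `shift`,
    averaging where the two images overlap."""
    out = first.astype(float).copy()
    dy, dx = shift
    h, w = second.shape
    rows = slice(max(dy, 0), min(dy + h, out.shape[0]))
    cols = slice(max(dx, 0), min(dx + w, out.shape[1]))
    # Region of `second` corresponding to the clipped target region.
    src_rows = slice(rows.start - dy, rows.stop - dy)
    src_cols = slice(cols.start - dx, cols.stop - dx)
    out[rows, cols] = (out[rows, cols] + second[src_rows, src_cols]) / 2.0
    return out

# Usage: the shared feature sits at (10, 12) in the first image and at
# (2, 4) in the second, so the correction is a (+8, +8) translation.
first = np.zeros((32, 32))
second = np.ones((16, 16))
final = superpose(first, second, coordinate_correction((10, 12), (2, 4)))
```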
CN202111250313.9A 2021-10-26 2021-10-26 Image data acquisition method and device, computing equipment and storage medium Pending CN113971754A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111250313.9A CN113971754A (en) 2021-10-26 2021-10-26 Image data acquisition method and device, computing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111250313.9A CN113971754A (en) 2021-10-26 2021-10-26 Image data acquisition method and device, computing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113971754A true CN113971754A (en) 2022-01-25

Family

ID=79588424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111250313.9A Pending CN113971754A (en) 2021-10-26 2021-10-26 Image data acquisition method and device, computing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113971754A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115222805A (en) * 2022-09-20 2022-10-21 威海市博华医疗设备有限公司 Prospective imaging method and device based on lung cancer image

Similar Documents

Publication Publication Date Title
CN109961491B (en) Multi-mode image truncation compensation method, device, computer equipment and medium
CN112770838B (en) System and method for image enhancement using self-focused deep learning
EP2752783A2 (en) Apparatus and method for supporting acquisition of multi-parametric images
US11093699B2 (en) Medical image processing apparatus, medical image processing method, and medical image processing program
US20100128841A1 (en) Smoothing of Dynamic Data Sets
CN107252353B (en) Control method of medical imaging equipment and medical imaging equipment
EP4018371A1 (en) Systems and methods for accurate and rapid positron emission tomography using deep learning
WO2021041125A1 (en) Systems and methods for accurate and rapid positron emission tomography using deep learning
JP2014518125A (en) Follow-up image acquisition plan and / or post-processing
US20100177941A1 (en) Medical image diagnosis support system and image processing method
CN113971754A (en) Image data acquisition method and device, computing equipment and storage medium
US11923069B2 (en) Medical document creation support apparatus, method and program, learned model, and learning apparatus, method and program
CN114331992A (en) Image sequence processing method and device, computing equipment and storage medium
KR101941209B1 (en) Standalone automatic disease screening system and method based on artificial intelligence
Ravi et al. An efficient semi-supervised quality control system trained using physics-based MRI-artefact generators and adversarial training
CN108511052B (en) Method for determining a projection data set and projection determination system
WO2019235335A1 (en) Diagnosis support system, diagnosis support method, and diagnosis support program
CN114048738A (en) Data acquisition method, device, computing equipment and medium based on symptom description
US11776173B2 (en) Generating reformatted views of a three-dimensional anatomy scan using deep-learning estimated scan prescription masks
US8515149B2 (en) Inspection system and method for determining three dimensional model of an object
CN115546154B (en) Image processing method, device, computing equipment and storage medium
CN116327239A (en) Method, apparatus, computing device, and storage medium for assisting ultrasound scanning
US20220108451A1 (en) Learning device, method, and program, medical image processing apparatus, method, and program, and discriminator
US20240104802A1 (en) Medical image processing method and apparatus and medical device
US20230067053A1 (en) Analysis assisting device, analysis assisting system, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Rooms 303, 304, 305, 321 and 322, building 3, No. 11, Chuangxin Road, science and Technology Park, Changping District, Beijing

Applicant after: Shukun Technology Co.,Ltd.

Address before: Rooms 303, 304, 305, 321 and 322, building 3, No. 11, Chuangxin Road, science and Technology Park, Changping District, Beijing

Applicant before: Shukun (Beijing) Network Technology Co.,Ltd.