CN116703908B - Imaging system testing method and device and imaging system - Google Patents


Info

Publication number
CN116703908B
Authority
CN
China
Prior art keywords
image
test
interface
imaging system
value
Prior art date
Legal status
Active
Application number
CN202310976430.6A
Other languages
Chinese (zh)
Other versions
CN116703908A (en)
Inventor
徐婉君
姚玉成
袁佳佳
钱秀秀
Current Assignee
Hefei Yofo Medical Technology Co ltd
Original Assignee
Hefei Yofo Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hefei Yofo Medical Technology Co ltd filed Critical Hefei Yofo Medical Technology Co ltd
Priority to CN202310976430.6A
Publication of CN116703908A
Application granted
Publication of CN116703908B
Legal status: Active

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0002: Inspection of images, e.g. flaw detection (G06T 7/00 Image analysis)
    • G06T 5/40: Image enhancement or restoration by the use of histogram techniques
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/10081: Computed x-ray tomography [CT] (G06T 2207/10072 Tomographic images)
    • G06T 2207/20104: Interactive definition of region of interest [ROI] (G06T 2207/20092 Interactive image processing based on input by user)
    • G06T 2207/20216: Image averaging (G06T 2207/20212 Image combination)

Abstract

The invention provides a testing method and device for an imaging system, and an imaging system. The method executes a pre-created script file to obtain the interface content features of the test interface at the control end, analyzes those features to identify the position of the control to be operated in the test interface, and, according to that position, triggers the action event of the control on the current test interface so that the test process enters the next node. The phantom is automatically shot through the projection end and the shot images are reconstructed into a three-dimensional CT image; the CT image is compared with a corresponding first sample image acquired in advance to obtain the image deviation, and the test result of the imaging system is determined from that deviation. By recognizing the features of the test interface to simulate an operator's actions, the parts of the test process that can be standardized are changed from manual to automatic operation, which improves test efficiency.

Description

Imaging system testing method and device and imaging system
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and an apparatus for testing an imaging system, and an imaging system.
Background
Currently, imaging systems are tested manually. For example, when a CBCT (cone-beam CT) system is tested, many operations must be performed by hand, including, but not limited to, operating the computer end so that the shooting process can continue and observing the reconstructed image to judge whether its quality meets the standard.
Performing the image quality test of a CBCT system manually costs considerable time and labor, so test efficiency is low. Moreover, judging image quality by eye risks misjudgment, and because each person's judgment standard is influenced by subjective factors and cannot be fully unified, the accuracy of the judgment result is difficult to guarantee.
Disclosure of Invention
In order to solve at least one of the above technical problems, the invention provides a testing method and device for an imaging system, and an imaging system.
A first aspect of the invention provides a testing method of an imaging system, wherein the imaging system comprises a projection end and a control end in communication connection with each other, and the projection end is provided with a detection position for placing a phantom. The testing method comprises the following steps: when the control end is the execution subject of the current node in the test process, obtaining interface content features of the test interface of the control end by executing a pre-created script file; analyzing the interface content features to identify the position of the control to be operated in the test interface; triggering an action event of the control to be operated on the current test interface according to that position, so that the test process enters the next node, the action event comprising a click event; after the phantom is automatically shot through the projection end and the shot images are reconstructed into a three-dimensional CT image, obtaining a first CT image of the three-dimensional CT image at at least one slice layer; comparing the first CT image with a corresponding first sample image acquired in advance to obtain an image deviation condition; and determining a test result of the imaging system according to the image deviation condition.
According to one embodiment of the invention, the interface content features comprise an interface image, which is a partial or complete image of the test interface and contains the control to be operated.
According to one embodiment of the invention, analyzing the interface content features to identify the position of the control to be operated in the test interface includes: taking the interface content features as a first interface image, comparing the first interface image with second sample images acquired in advance, and identifying the second sample image corresponding to the first interface image; and determining the position of the control to be operated in the test interface according to its position in the identified second sample image.
According to one embodiment of the present invention, the node used as the execution body by the control end in the test process includes at least one of the following: inputting test information, confirming the test information, starting shooting, and displaying CT images; when the node which is taken as an execution subject by the control end comprises a node for inputting test information, the action event also comprises a text input event.
According to one embodiment of the present invention, before the projection end automatically shoots the phantom and reconstructs the shot image to obtain a three-dimensional CT image, the test method further includes: when entering a positioning confirmation node, acquiring address information of the projection end from the control end; and sending a positioning confirmation instruction to the projection end according to the address information, so that the projection end feeds back a first signal representing the positioning completion, wherein the first signal is used for triggering the testing process to enter a next node of the positioning confirmation node.
According to one embodiment of the present invention, before the address information of the projection end is acquired from the control end, the testing method further includes: identifying the state value of the positioning flag bit; when the state value is equal to a preset state value, directly entering the node after the positioning confirmation node; and when it is not, entering the positioning confirmation node.
According to one embodiment of the present invention, if the state value of the positioning flag bit is not equal to the preset state value, the state value of the positioning flag bit is changed to the preset state value before the test process is triggered to enter the node after the positioning confirmation node.
According to one embodiment of the present invention, comparing the first CT image with a corresponding first sample image acquired in advance to obtain an image deviation condition includes: acquiring histograms of the first CT image and the corresponding first sample image; taking the histogram of the first CT image as a first histogram and the histogram of the corresponding first sample image as a second histogram, calculating, for each gray value, the difference in pixel count between the first histogram and the second histogram; calculating an overall difference value between the two histograms from those per-gray-value differences; and when the overall difference value is smaller than a preset difference value, the obtained image deviation condition indicates that the image quality is qualified; otherwise it indicates that the image quality is unqualified.
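One plausible reading of this histogram comparison, sketched with NumPy; the bin range, the normalization by image size, and the example threshold are assumptions not fixed by the text.

```python
import numpy as np

def histogram_deviation(ct_img, sample_img, bins=256):
    """Sum of per-gray-value pixel-count differences between the two
    histograms, normalized by image size (normalization is an assumed
    choice)."""
    h1, _ = np.histogram(ct_img, bins=bins, range=(0, bins))
    h2, _ = np.histogram(sample_img, bins=bins, range=(0, bins))
    return np.abs(h1 - h2).sum() / ct_img.size

def quality_ok(ct_img, sample_img, threshold=0.05):
    # Qualified when the overall difference stays below the preset value.
    return histogram_deviation(ct_img, sample_img) < threshold
```

Identical images yield a deviation of zero, so a well-imaged phantom compared against its golden sample should fall comfortably under the threshold.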
According to one embodiment of the present invention, after the first CT image is obtained, the testing method further includes: acquiring at least one region of interest on the first CT image; determining a contrast noise index according to the pixel values of the pixel points in the region of interest; and, when determining the test result of the imaging system according to the image deviation condition, also determining the test result according to the contrast noise index.
According to one embodiment of the invention, each edge of the region of interest is parallel to a row or column of pixels in the first CT image, and the region of interest covers regions of two different media and the boundary line between them.
According to one embodiment of the present invention, determining the contrast noise index according to the pixel values of the pixel points in the region of interest includes: taking the direction parallel to the boundary line as a first direction, determining the average value of each pixel queue among all pixel queues parallel to the first direction; and determining the contrast noise index according to the arrangement of the pixel-queue averages along a second direction perpendicular to the first direction.
According to one embodiment of the invention, determining the contrast noise index from the arrangement of the pixel-queue averages along the second direction comprises: for each pixel queue, determining the mean of the averages of all pixel queues within a certain area on each of its two sides, and taking the difference between the two side means as a first difference value; determining the difference between the first difference values of adjacent pixel queues to obtain a second difference value for each pixel queue; and taking the pixel queue corresponding to the maximum of the second difference values as a first index queue and the pixel queue corresponding to the minimum as a second index queue, and calculating the contrast noise index from the averages of the first index queue and the second index queue.
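A hedged sketch of this row-profile analysis, in which the pixel queues are taken as the rows of the ROI, the "certain area on both sides" is a fixed-size window, and the contrast noise index is computed as the difference between the averages of the two index queues; all three choices are assumptions, since the text leaves them open.

```python
import numpy as np

def contrast_noise_index(roi, window=3):
    """ROI rows are assumed parallel to the media boundary (the first
    direction); the profile of row averages is analyzed along the
    second direction."""
    profile = roi.mean(axis=1)          # average of each pixel queue
    n = len(profile)
    first_diff = np.full(n, np.nan)
    for i in range(window, n - window):
        left = profile[i - window:i].mean()        # mean of averages, one side
        right = profile[i + 1:i + 1 + window].mean()  # mean of averages, other side
        first_diff[i] = right - left               # first difference value
    second_diff = np.diff(first_diff)   # difference between adjacent queues
    valid = np.nan_to_num(second_diff, nan=0.0)
    hi_row = int(np.argmax(valid))      # first index queue
    lo_row = int(np.argmin(valid))      # second index queue
    return float(abs(profile[hi_row] - profile[lo_row]))
```

For a two-media ROI, the index queues land on one row per medium, so the returned value reflects the contrast across the boundary.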
A second aspect of the present invention provides a testing device for an imaging system, the imaging system including a projection end and a control end communicatively connected with each other, the projection end being provided with a detection position for placing a phantom. The testing device includes: a memory storing execution instructions; and a processor that executes the instructions stored in the memory, causing the processor to perform the testing method of the imaging system according to any one of the embodiments.
In a third aspect of the present invention, an imaging system is provided, the imaging system including a projection end and a control end which are communicatively connected to each other, the projection end being provided with a detection position for placing a phantom, the control end including: a memory storing execution instructions; and a processor executing the execution instructions stored in the memory, causing the processor to execute the test method of the imaging system according to any one of the embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
Fig. 1 is a flow chart of a testing method of an imaging system according to an embodiment of the present invention.
FIG. 2 is a flow diagram of identifying a position of a control to be operated according to one embodiment of the invention.
Fig. 3 is a flow chart of a testing method of an imaging system according to another embodiment of the present invention.
Fig. 4 is a flow chart of a case of obtaining an image deviation according to an embodiment of the present invention.
Fig. 5 is a schematic view of a phantom cut according to one embodiment of the invention.
FIG. 6 is a schematic diagram of a test setup of an imaging system employing a hardware implementation of a processing system, according to one embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative and not restrictive of the invention. It should be further noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
In addition, the embodiments of the present invention and the features of the embodiments may be combined with each other without collision. The technical scheme of the present invention will be described in detail below with reference to the accompanying drawings in combination with embodiments.
Unless otherwise indicated, the exemplary implementations/embodiments shown are to be understood as providing exemplary features of various details of some of the ways in which the technical concepts of the present invention may be practiced. Thus, unless otherwise indicated, the features of the various implementations/embodiments may be additionally combined, separated, interchanged, and/or rearranged without departing from the technical concepts of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, when the terms "comprises" and/or "comprising," and variations thereof, are used in the present specification, the presence of stated features, integers, steps, operations, elements, components, and/or groups thereof is described, but the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof is not precluded. It is also noted that, as used herein, the terms "substantially," "about," and other similar terms are used as approximation terms and not as degree terms, and as such, are used to explain the inherent deviations of measured, calculated, and/or provided values that would be recognized by one of ordinary skill in the art.
Taking a CBCT system as an example, the testing method, the testing device and the imaging system of the imaging system are described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a testing method of an imaging system according to an embodiment of the present invention. Referring to fig. 1, a test method M10 of an imaging system of the present embodiment may include steps S100, S200, S300, S400, S500, and S600. The imaging system comprises a projection end and a control end which are in communication connection with each other, and the projection end is provided with a detection position for placing a phantom.
S100, when the control end is used as an execution main body of the current node in the test process, the interface content characteristics of the test interface of the control end are obtained through executing the script file which is created in advance.
And S200, analyzing the interface content characteristics so as to identify the position of the control to be operated in the test interface.
And S300, triggering an action event of the control to be operated on the current test interface according to the position of the control to be operated, so that the test process enters the next node, and the action event comprises a click event.
S400, after the body model is automatically shot through the projection end and the shot image is reconstructed to obtain a three-dimensional CT image, a first CT image of the three-dimensional CT image under at least one image layer is obtained.
S500, comparing the first CT image with a corresponding first sample image acquired in advance to obtain an image deviation condition.
S600, determining a test result of the imaging system according to the image deviation condition.
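The node-driven loop of steps S100 to S300 can be sketched as below; the `capture`, `analyze`, and `trigger` callables stand in for the script file's screenshot, image-analysis, and event-firing routines, and all names are illustrative rather than taken from the patent.

```python
def run_nodes(nodes, capture, analyze, trigger):
    """Advance through the test nodes at the control end: capture the
    interface, locate the control to operate, and fire its action
    event so the test process enters the next node."""
    history = []
    for node in nodes:
        features = capture(node)        # S100: interface content features
        position = analyze(features)    # S200: position of the control
        trigger(node, position)         # S300: fire the action event
        history.append((node, position))
    return history
```

Steps S400 to S600 (shooting, reconstruction, and comparison) would then run once the loop has driven the flow past the shooting node.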
According to the testing method of the imaging system provided by the embodiment of the invention, the operation of an operator is simulated by recognizing the features of the test interface, so that the parts of the test process that can be standardized are changed from manual to automatic operation, and the obtained images are automatically compared and checked. This improves test efficiency, reduces the labor cost of testing, and improves the accuracy of the test result.
The projection end is used as controlled equipment and used for scanning and projecting the detected object, so as to obtain projection data under different scanning angles. The projection end may include a source, detector, and other components as desired. The control end is used as control equipment, can be connected with the projection end in a wired or wireless mode, and controls the projection end to scan and project through signal interaction with the projection end. The control end is configured with a display for displaying information such as a test interface, test data and the like in the test process. The control end can be a computer, and the control end is equivalent to the computer end.
The projection end and the control end are usually arranged separately and are positioned at different positions, so that the damage of X-rays to operators nearby during scanning is avoided. For example, the projection end source, detector, etc. may be disposed in the lead chamber, with the lead chamber being closed during scanning to avoid radiation leakage.
When testing the imaging system, a phantom may be placed on the detection position of the projection end. The phantom can be a cylinder composed of several modules, placed with its end face downwards. The phantom can be used to measure performance indexes such as image uniformity, high-contrast resolution, low-contrast resolution, and ranging error of imaging systems such as oral CT and oral cone-beam CT systems, and to evaluate indexes such as signal-to-noise ratio, spatial resolution, and image gray-value uniformity of CBCT images. The detection position can bear a mark, and placing the phantom at the mark completes phantom placement.
The testing process is mainly performed through testing software, and the testing software guides an operator to perform corresponding operations, such as confirming testing information, triggering shooting tasks and the like, until the testing process is completed. The test method M10 is mainly used for automatically operating the test software by executing the script file so as to realize the automatic operation of the test process.
The script file is a pre-created file that can automatically control the actions of the imaging system. It can be written in Python and, as an automation tool of the embedded device, run directly on the control end (such as a computer) or on a separate device communicatively connected to the control end. Before the script file is used to test an imaging system automatically, an imaging system whose performance indexes all meet the standard images the phantom, and the resulting CT images are collected and stored under a designated path as comparison images (golden samples) for subsequent tests of other imaging systems. All comparison images used in the test process must be collected and stored before the test; they can form CT image sets or CT image libraries organized by node, and are read from the library of the corresponding node whenever a comparison is needed.
The whole test process of the imaging system can be divided into a plurality of nodes; the nodes can be divided such that each operation the imaging system requires before the test process can continue corresponds to one node. Among all the nodes in the test process, the end to be operated is the control end for some nodes and the projection end for the others. When it runs, the script file executes according to the preset node order and the operations corresponding to each node, so the whole test flow can be completed automatically.
The execution subject is the end at which an operation is to be performed. If the current node can only be completed by operating the control end, the current execution subject is the control end; the same applies to the projection end.
Before the test process starts, the phantom is placed at the detection position of the projection end and the script file is executed; the test process is then triggered to start at the control end. When the test process starts, an automatic operation usually needs to be performed on the control end to advance the test flow, i.e., the execution subject of the first node is the control end. Suppose the first node corresponds to inputting test information: a test task is created by entering information items such as the test object serial number, test time, test item, and phantom type. The serial number and test time can be generated automatically by the script file, and the test item and phantom type can automatically adopt default values.
At this point, the running script can acquire the interface content features of the test interface of the control end and determine the position of the control to be operated by analyzing them. The control to be operated can include a button; for example, when a confirm button is present on the current test interface, the script automatically obtains its position by acquiring and analyzing the interface content features and automatically triggers its action event to enter the next step. The action event corresponding to a button is usually a click event. The script can wrap an automation library and, by calling it to simulate the mouse and keyboard on the test interface of the control end, click the button automatically so as to enter the next node. Compared with manual operation, the operator no longer needs to move the mouse to the confirm button and click it, which speeds up testing and saves labor.
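The locating side of this step can be sketched as an exact template comparison against a grayscale screenshot; a real script would then hand the returned centre to a simulated-mouse library to fire the click, which is not shown here, and the exact matching strategy is an assumption.

```python
import numpy as np

def find_control(interface_img, template):
    """Locate a control (e.g. the confirm button) inside a screenshot
    by exhaustive exact comparison; returns the centre (row, col) to
    click, or None if the template does not appear."""
    H, W = interface_img.shape
    h, w = template.shape
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            if np.array_equal(interface_img[r:r + h, c:c + w], template):
                return (r + h // 2, c + w // 2)
    return None
```

Exact matching assumes the interface renders identically to the stored template; a tolerance-based or normalized cross-correlation match would be more robust against anti-aliasing differences.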
The script performs feature acquisition, feature analysis, position recognition, and control-action triggering as above whenever the control end is the execution subject of the current node. When the projection end is the execution subject, for example after positioning confirmation is finished, the test enters the projection node from the positioning confirmation node; the projection end, as the execution subject, starts scanning the phantom and obtains projection images. The projection end and the control end can be connected through a network cable; after the projection end is automatically controlled to scan the phantom at the detection position, it can upload the projection images to the control end over the cable, and the control end reconstructs them into a three-dimensional CT image.
The script acquires a first CT image of the three-dimensional CT image at one or more slice layers. The first CT image is an image of an axial slice of the CT-reconstructed three-dimensional voxel array of the phantom. The axial slice is perpendicular to the rotation axis of the projection end during rotational scanning; the rotation axis is the central axis around which the source and detector rotate during scanning. A projection image is obtained at each rotation angle, and the three-dimensional voxel array of the phantom is reconstructed from the projection images at all scanning angles.
The control end can compare the first CT images with the corresponding comparison images; for example, it can automatically select 10 slice layers at different axial positions to obtain 10 first CT images. Here, "axial" refers to the direction of the vertical axis that is parallel to the radiation-receiving surface of the detector and perpendicular to the cross-section of the phantom. The first CT images are compared with the 10 corresponding first sample images in the CT image library to obtain 10 image deviation conditions. A lower image deviation indicates that the image quality is qualified; a higher deviation indicates that it is not. By recognizing images automatically, the judgment of whether the image quality meets the requirement becomes quantitative, replacing naked-eye observation; this increases the accuracy of the result and reduces the risk of manual misjudgment.
The test result of the imaging system can be obtained from these image deviation conditions, so as to evaluate performance indexes such as the scanning capability and reconstruction capability of the imaging system. The evaluation criterion for the imaging system may be: if most of the images are of qualified quality, the scanning and reconstruction capabilities of the imaging system meet the requirements; if most are unqualified, they do not. The criterion for a single image may be, for example: if the image deviation exceeds a configured proportion threshold (e.g., 5% or 10%), the image quality is unqualified.
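The aggregation just described can be sketched as follows; the per-image threshold and the majority rule standing in for "most of the images" are illustrative values, not figures fixed by the patent.

```python
def system_passes(deviations, image_threshold=0.05, majority=0.5):
    """System-level verdict from per-slice deviation values: an image
    passes if its deviation is below the threshold, and the system
    passes if the fraction of passing images exceeds the majority."""
    passed = [d < image_threshold for d in deviations]
    return sum(passed) / len(passed) > majority
```

With 10 slice layers, nine qualified images would yield a passing system while a single qualified image would not.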
It can be understood that the projection end can be controlled by the script to shoot automatically and cyclically several times, and the image deviation conditions obtained from the successive shots can be compared to verify shooting consistency. In addition, after the test result of the imaging system is obtained, a corresponding test report can be generated and stored in a designated location or sent directly to the mailbox of a designated person.
For example, the interface content features may include an interface image, which is a partial or complete image of the test interface, and in which the controls to be operated are included.
If the interface content feature includes an interface image, the position of the confirm button is identified by performing image analysis in step S200. The interface image can be a complete window screenshot of the test interface or a partial region screenshot of the test interface, and if the to-be-operated controls of each node are all located in the lower half region of the test interface, the screenshot of the lower half region of the test interface can be used as the interface image.
FIG. 2 is a flow diagram of identifying a position of a control to be operated according to one embodiment of the invention. Referring to fig. 2, step S200 may include step S210 and step S220.
S210, taking the interface content features as a first interface image, comparing it with second sample images acquired in advance, and identifying the second sample image corresponding to the first interface image.
S220, determining the position of the control to be operated in the test interface according to the position of the control to be operated in the identified second sample image.
The second sample image can be obtained by capturing a screenshot of the whole test interface, and the second sample images of the nodes can form a screenshot image library. The test interface may be a confirm-test-information interface, a trigger-shooting-task interface, a CT image display interface, and so on. The first interface image is compared one by one with the second sample images of different nodes in the screenshot library. If the first interface image is a complete image of the test interface, it has the same size as the second sample images, so the comparison can check the pixel points at the same positions of the two images in turn; if the number of pixel points with matching pixel values meets the number requirement, the two images correspond to the same node. In this way, the second sample image corresponding to the current node is identified among the second sample images of the different nodes.
The second sample image may also be obtained by capturing a partial region of the test interface, where the partial region has the same size and position within the entire test interface as the partial region selected when the interface image was captured. The first interface image and the second sample image therefore have the same size, and the comparison may again be performed by sequentially comparing pixels at the same positions of the two images; if the number of pixels with matching pixel values meets the count requirement, the two images correspond to the same node, and the second sample image corresponding to the current node is identified.
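As a rough illustration of this pixel-by-pixel matching, the sketch below assumes images are given as equal-size 2-D lists of gray values; the function names and the 95% match ratio are illustrative assumptions, not values from this disclosure.

```python
def images_match(img_a, img_b, min_match_ratio=0.95):
    """Compare two equal-size images pixel by pixel; return True when the
    number of identical pixels meets the count requirement."""
    if len(img_a) != len(img_b) or len(img_a[0]) != len(img_b[0]):
        return False  # sizes must match before pixel comparison makes sense
    total = len(img_a) * len(img_a[0])
    same = sum(1 for row_a, row_b in zip(img_a, img_b)
               for pa, pb in zip(row_a, row_b) if pa == pb)
    return same >= min_match_ratio * total

def identify_node(first_interface_image, sample_library):
    """sample_library maps node name -> second sample image; return the
    node whose sample matches the current interface image, else None."""
    for node, sample in sample_library.items():
        if images_match(first_interface_image, sample):
            return node
    return None
```

In practice the recorded control position of the matched sample would then be read back directly, as described below.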
When the second sample image is acquired, the position of the control to be operated in the second sample image can be identified and recorded in advance. After the second sample image corresponding to the current node is identified, the position of the control to be operated in the second sample image can be directly read as the position of the control to be operated in the first interface image, so that a click event is triggered based on the position.
Illustratively, the nodes for which the control end serves as the execution subject during the test may include at least one of the following: inputting test information, confirming the test information, starting shooting, and displaying CT images. When the nodes for which the control end serves as the execution subject include the node of inputting test information, the action event also includes a text input event.
The test information is used for creating a test task; information items such as the test object serial number, test time, test item, and phantom type need to be entered into the test software. At this point, the script file can perform the input automatically; during input, the control to be operated is a text box and the action event is a text input event.
Assume that at the initial moment when the test procedure starts, the first node is inputting test information and the control end is the current execution subject. At this time, a plurality of text boxes and a confirm button are arranged on the test interface, and the controls to be operated corresponding to the first node are the text boxes. The script file can identify the position of each text box according to its configuration and fill the test object serial number, test time, test item, phantom type, and other information into the corresponding text boxes by triggering text input events; the test object serial number and test time can be generated automatically by the script file, while the test item and phantom type can automatically adopt preset default values.
After the test information has been entered, the first node is complete and the script automatically enters the second node, confirming the test information. The current execution subject is still the control end, and the control to be operated corresponding to the second node is the confirm button. At this time, the script file identifies the position of the confirm button according to its configuration and automatically confirms the test information by triggering a click event, thereby entering the third node.
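The first two nodes described above can be sketched as plain Python logic. The field names, timestamp formats, and callback interfaces below are hypothetical; the real script would deliver the text-input and click events through a GUI-automation layer.

```python
import datetime

def build_test_info(defaults=None):
    """Auto-generate the test information entered at the first node.
    The serial number and test time are generated by the script; the
    test item and phantom type fall back to preset default values."""
    defaults = defaults or {"test_item": "CBCT", "phantom_type": "standard"}
    now = datetime.datetime.now()
    return {
        "serial_number": now.strftime("T%Y%m%d%H%M%S"),  # generated
        "test_time": now.strftime("%Y-%m-%d %H:%M:%S"),  # generated
        "test_item": defaults["test_item"],              # preset default
        "phantom_type": defaults["phantom_type"],        # preset default
    }

def fill_and_confirm(info, type_into, click_confirm):
    """Trigger a text-input event for each text box, then the click
    event on the confirm button (callbacks injected by the runner)."""
    for field, value in info.items():
        type_into(field, value)  # text input event on that field's box
    click_confirm()              # click event -> test enters next node
```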
Fig. 3 is a flow chart of a testing method of an imaging system according to another embodiment of the present invention. Referring to fig. 3, the test method M10 may further include step S380 and step S390 before step S400.
S380, when entering the positioning confirmation node, the address information of the projection end is acquired from the control end.
S390, sending a positioning confirmation instruction to the projection terminal according to the address information, so that the projection terminal feeds back a first signal representing the positioning completion, and the first signal triggers the testing process to enter the next node of the positioning confirmation node.
Before starting the shooting node, positioning confirmation can be performed, so that the positioning confirmation node can be further included in the test process. The positioning confirmation node may be a third node or a node after the third node, and the execution body of the positioning confirmation node is a projection end.
After entering the positioning confirmation node, the script can acquire the address information of the projection end. The address information can be serial port information. By calling the pyserial library, the script can send a connection instruction to the projection end according to the address information, and the projection end replies with a message indicating a successful connection after receiving the instruction. The pyserial library is used for serial communication between devices. After receiving the message, the script calls the pyserial serial-port module to send a positioning confirmation instruction to the projection end according to the communication protocol.
After receiving the positioning confirmation instruction, the projection end automatically feeds back a first signal to the control end. After the control end receives the first signal, the script can enter the test of the next node. The positioning confirmation instruction is used for confirming the state of the projection end; its purpose is to prevent unnecessary radiation to the examined object caused by starting a scan before positioning is complete. The touch screen of the projection end is arranged in the lead room together with the projection end. During a manual test, an operator has to walk from the control end to the lead room to click the confirm button on the touch screen, then close the lead room and return to the control end to continue operating. Because the phantom is placed in advance, it does not need to be checked and confirmed again, and the communication between the projection end and the control end can be controlled automatically by the script. The operator therefore does not need to press the touch-screen confirm button while moving back and forth between the projection end and the control end during the test, which further improves test efficiency and saves test time.
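The serial exchange can be illustrated with a toy framing layer. The byte values and checksum scheme below are invented for illustration and are not the actual communication protocol; actually transmitting a frame would use the pyserial library (e.g. `serial.Serial(port, baudrate).write(frame_bytes)`).

```python
# Hypothetical command framing for the control-end <-> projection-end
# serial link; all byte values here are illustrative assumptions.
CMD_CONNECT = 0x01
CMD_CONFIRM_POSITIONING = 0x02
ACK_CONNECTED = 0x81
SIG_POSITIONING_DONE = 0x82  # the "first signal" fed back by the projection end

def frame(cmd: int) -> bytes:
    """Wrap a command byte with a start byte and a simple mod-256 checksum."""
    body = bytes([0xAA, cmd])
    return body + bytes([sum(body) & 0xFF])

def parse(frame_bytes: bytes) -> int:
    """Validate the start byte and checksum, return the command/signal byte."""
    if len(frame_bytes) != 3 or frame_bytes[0] != 0xAA:
        raise ValueError("bad frame")
    if (sum(frame_bytes[:2]) & 0xFF) != frame_bytes[2]:
        raise ValueError("bad checksum")
    return frame_bytes[1]
```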
The shooting node is located after the positioning confirmation node, with the projection end as the execution subject. In response to the first signal, the control end can send a shooting instruction to the projection end through the serial port. The projection end automatically starts scanning the phantom at the detection position and obtains projection images as scanning data. Meanwhile, the projection images can be uploaded to the control end through the network port; the control end can reconstruct the projection images to obtain a three-dimensional CT image and can compare images to obtain the image deviation condition.
Illustratively, with continued reference to FIG. 3, the test method M10 may further include a step S370 prior to the step S380.
S370, identifying the state value of the positioning flag bit; when the state value is equal to a preset state value, directly entering the node following the positioning confirmation node, and when the state value is not equal to the preset state value, entering the positioning confirmation node.
The script is provided with a positioning flag bit, which is used for determining whether the positioning confirmation process needs to be executed. The value of the positioning flag bit and the preset state value can be expressed as Boolean values. Assuming that the preset state value is "TRUE", if the value of the positioning flag bit is also "TRUE", the projection end can be triggered directly to shoot automatically without performing step S380 and step S390, saving test time. If the value of the positioning flag bit is "FALSE", the positioning confirmation node is entered, step S380 and step S390 are executed to perform the positioning confirmation process, and the projection end is then triggered to shoot automatically.
If the state value of the positioning flag bit is not equal to the preset state value, the state value of the positioning flag bit is changed to the preset state value before the test process is triggered to enter the node following the positioning confirmation node. By modifying the value of the positioning flag bit so that it matches the preset state value, the subsequent test process does not need to judge, when entering the shooting interface before each shot, whether positioning has been confirmed; the judgment is skipped and the state after positioning completion is entered directly, improving test efficiency.
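A minimal sketch of this flag-bit logic, assuming the flag and preset state value are Booleans as in the example above; the function and parameter names are illustrative.

```python
PRESET_STATE = True  # preset state value, expressed as a Boolean

def run_positioning_stage(flag_value, confirm_positioning):
    """If the positioning flag bit already equals the preset state value,
    skip the positioning-confirmation node; otherwise run the confirmation
    (steps S380/S390) and then set the flag to the preset value so the
    check is skipped before every subsequent shot."""
    if flag_value == PRESET_STATE:
        return flag_value      # go straight to the next node
    confirm_positioning()      # S380 + S390: handshake with projection end
    return PRESET_STATE        # flag updated before entering the next node
```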
Fig. 4 is a flow chart of a case of obtaining an image deviation according to an embodiment of the present invention. Referring to fig. 4, step S500 may include step S510, step S520, step S530, and step S540.
S510, acquiring histograms of the first CT image and the corresponding first sample image.
S520, taking the histogram of the first CT image as a first histogram and the histogram of the corresponding first sample image as a second histogram, calculating the difference in the number of pixels at each corresponding gray value between the first histogram and the second histogram.
S530, calculating a difference value between the first histogram and the second histogram according to the difference value.
S540, when the difference value is smaller than the preset difference value, the obtained image deviation condition represents that the image quality is qualified, otherwise, the obtained image deviation condition represents that the image quality is unqualified.
The histogram may be a gray-level histogram, which records the gray values of the image and the number of pixels at each gray value. Let the first histogram be Ah and the second histogram be Bh, let the number of pixels of the first histogram Ah at the i-th gray value be a_i, and let the number of pixels of the second histogram Bh at the i-th gray value be b_i; the difference in pixel count at the i-th gray value between the first histogram Ah and the second histogram Bh is then a_i − b_i. The difference value differ between the first histogram Ah and the second histogram Bh can be calculated by the following formula:

differ = Σ_{i=1}^{n} |a_i − b_i|
where n is the number of distinct gray values in the first histogram Ah, which equals the number of distinct gray values in the second histogram Bh. By comparing whether the number of pixels at each gray value is the same in the two histograms, the difference value of the two histograms is obtained. If the number of pixels at every gray value is the same in both histograms, the difference value of the two histograms is 0.
The preset difference value C may be set as C = pct × PixNum × 0.001, where pct is a preset error percentage and PixNum is the total number of pixels of the first CT image, that is, the product of its width and height (equivalently, the total number of pixels of the first sample image).
If the difference value is less than or equal to C, the first CT image is considered to be matched with the first sample image, the image deviation condition meets the requirement, otherwise, the first CT image is considered to be not matched with the first sample image, and the image deviation condition fails to meet the requirement.
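A minimal sketch of this histogram comparison, assuming the difference value is the sum of absolute pixel-count differences over all gray values and that images are given as 2-D lists of gray values; the default `pct` is an arbitrary example value.

```python
from collections import Counter

def histogram(image):
    """Gray-level histogram: gray value -> number of pixels."""
    return Counter(p for row in image for p in row)

def histogram_differ(hist_a, hist_b):
    """differ = sum over gray values of |a_i - b_i|."""
    grays = set(hist_a) | set(hist_b)
    return sum(abs(hist_a.get(g, 0) - hist_b.get(g, 0)) for g in grays)

def quality_ok(ct_image, sample_image, pct=5.0):
    """Pass when differ <= C, with C = pct * PixNum * 0.001."""
    differ = histogram_differ(histogram(ct_image), histogram(sample_image))
    pix_num = len(ct_image) * len(ct_image[0])  # width x height
    return differ <= pct * pix_num * 0.001
```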
Fig. 5 is a schematic view of a section of the phantom according to one embodiment of the invention. Referring to fig. 5, after the first CT image is obtained in step S400, the testing method M10 may further include the following steps: acquiring at least one region of interest on the first CT image; and determining the contrast noise index according to the pixel values of the pixel points in the region of interest. In addition, when executing step S600, the test result of the imaging system may also be determined according to the contrast noise condition.
The intermediate portion of the phantom may also be used to test the contrast noise of the CBCT image during the test. Fig. 5 shows an image of one cross-sectional layer of the phantom, i.e., a first CT image, in which the H1 and H4 modules may be made of PMMA (polymethyl methacrylate), the H2 module may be made of PVC (polyvinyl chloride), and the H3 region may be air. ROI denotes the region of interest. A rectangular region of interest ROI is selected on the first CT image, with the axial slice as the image layer; in fig. 5, the "axial" direction is the direction perpendicular to both the X-axis and the Y-axis.
Each side of the region of interest ROI may be parallel to a row or column of pixels in the first CT image, and the region of interest may contain regions of two different media and boundary lines of the two different media.
Assuming that each row of pixels in the first CT image is ordered along the X-axis and each column of pixels is ordered along the Y-axis, two opposite sides of the region of interest ROI are parallel to the direction of the pixel row (X-axis) and the other two opposite sides are parallel to the direction of the pixel column (Y-axis). At the same time, the region of interest ROI contains both regions formed of PVC material and regions formed of air, that is to say, the region of interest ROI characterizes the transition zone between PVC and air, on which the contrast noise is calculated. The boundary line may be parallel to two opposite sides of the region of interest ROI to facilitate subsequent calculations.
Of the side lengths of the region of interest ROI, the side parallel to the boundary line may correspond to a length of 10 mm in the phantom (with a tolerance of 1 pixel), and the side perpendicular to the boundary line may correspond to a length of 3 mm in the phantom; fixing these dimensions establishes a scale, so that the calculation is always performed at the same physical size.
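Converting these millimetre dimensions into ROI side lengths in pixels requires the reconstruction's pixel spacing; the sketch below assumes a hypothetical spacing of 0.25 mm per pixel.

```python
def roi_size_px(parallel_mm=10.0, perpendicular_mm=3.0, pixel_spacing_mm=0.25):
    """Convert ROI side lengths from phantom millimetres to pixels.
    pixel_spacing_mm is the reconstructed pixel size (an assumed example
    value); a tolerance of 1 pixel applies to the side parallel to the
    boundary line."""
    parallel_px = round(parallel_mm / pixel_spacing_mm)
    perpendicular_px = round(perpendicular_mm / pixel_spacing_mm)
    return parallel_px, perpendicular_px
```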
For example, the manner of determining the contrast noise figure from the pixel values of the pixel points in the region of interest may comprise the steps of: the average value of each of all pixel queues parallel to the first direction is determined with the direction parallel to the boundary line as the first direction. And determining the contrast noise figure according to the arrangement of the average value of the pixel array in a second direction, wherein the second direction is perpendicular to the first direction.
With continued reference to fig. 5, the first direction is the X-axis direction and the second direction is the Y-axis direction. The pixel values of the region of interest ROI can be read along the X-axis direction parallel to the boundary line, that is, obtained one by one along the pixel rows; each row of pixels is one pixel queue, and the number of pixel queues is the number of pixel rows of the region of interest ROI.
The average values P (1), P (2), P (3), …, P (k) and standard deviations S (1), S (2), S (3), …, S (k) for each row of pixels are then calculated and consecutively numbered and recorded, where k is the total number of pixel rows of the region of interest ROI, i.e. the number of pixel queues.
For example, the manner of determining the contrast noise figure from the arrangement of the average value of the pixel array in the second direction may comprise the steps of: and respectively determining the average value of the average values of all the pixel queues in a certain area at two sides of the pixel queue, and taking the difference value between the average values at two sides as a first difference value. And determining the difference value between the average value difference values of the adjacent pixel queues to obtain a second difference value of the pixel queues. And taking the pixel queue corresponding to the maximum value in the second difference value as a first index queue, taking the pixel queue corresponding to the minimum value in the second difference value as a second index queue, and calculating the contrast noise figure according to the average value of the first index queue and the second index queue.
After the average value of each pixel queue is obtained, a pixel queue at a certain minimum distance, in the second direction, from both opposite sides of the region of interest ROI is taken as a target pixel queue. The averages of several pixel queues adjacent to one side of the target pixel queue in the second direction are averaged, the averages of several pixel queues adjacent to the other side are averaged, and the difference between the two side averages is obtained. When calculating the side average for one of the two sides, the average value of the current target pixel queue itself is also taken into account.
For example, the number of adjacent pixel queues on the positive Y-axis side is set to 4 and the number of adjacent pixel queues on the negative Y-axis side is set to 4, with the average value of the current target pixel queue itself counted into the average on the positive Y-axis side. For the target pixel queue m, the first difference is:

P′(m) = [P(m) + P(m+1) + P(m+2) + P(m+3) + P(m+4)] / 5 − [P(m−4) + P(m−3) + P(m−2) + P(m−1)] / 4

where P′(m) is the first difference corresponding to the target pixel queue m, P(m) is the pixel average of the target pixel queue m, the first bracketed term is the average over the positive Y-axis side (including the target queue itself), and the second is the average over the adjacent 4 pixel queues on the negative Y-axis side. The subsequent calculation is stabilized by this sliding window of 4 + 5 = 9 pixel queues.
The boundary line can be accurately located by means of the first differences. For each adjacent pair of first differences, the second difference is:

P″(m) = P′(m) − P′(m−1)

where P″(m) is the second difference corresponding to the target pixel queue m; the second differences P″(6), P″(7), …, P″(n−5) are obtained in this way. Along the pixel-value averages P(1), P(2), P(3), …, P(k), the transition zone produces two extremes of curvature, which are reflected in the second differences; the positions of the maximum and minimum over the transition zone are located through the second differences.
The maximum value P″(m(max)) and the minimum value P″(m(min)) are determined from the second differences P″(6), P″(7), …, P″(n−5), which determines the first index queue m(max) and the second index queue m(min). The contrast noise figure CNI is then calculated using the following formula:

CNI = |P(m(max)) − P(m(min))| / sqrt( S(m(max))² + S(m(min))² )
where P(m(max)) is the pixel average of the first index queue m(max), i.e., the average of the pixel values of the m(max)-th row, and P(m(min)) is the pixel average of the second index queue m(min), i.e., the average of the pixel values of the m(min)-th row; the jump height between P(m(max)) and P(m(min)) measures the image contrast. S(m(max))² is the variance of the pixel values of the first index queue m(max), and S(m(min))² is the variance of the pixel values of the second index queue m(min).
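The whole row-average / sliding-window / CNI computation can be sketched as follows, under the assumptions made above: a 4 + 5 window with the target queue counted into the positive side, second differences taken between adjacent first differences, ties broken by first occurrence, and population variance used for S².

```python
from statistics import mean, pvariance

def first_differences(P):
    """P'(m): mean of queues m..m+4 minus mean of queues m-4..m-1
    (the target queue's own average counted into the positive side)."""
    out = {}
    for m in range(4, len(P) - 4):           # keep the 9-queue window inside P
        out[m] = mean(P[m:m + 5]) - mean(P[m - 4:m])
    return out

def contrast_noise_index(rows):
    """rows: list of pixel queues (one list of pixel values per row).
    CNI = |P(m_max) - P(m_min)| / sqrt(S(m_max)^2 + S(m_min)^2)."""
    P = [mean(r) for r in rows]              # per-queue pixel averages
    d1 = first_differences(P)
    ms = sorted(d1)
    d2 = {m: d1[m] - d1[m - 1] for m in ms[1:]}   # second differences
    m_max = max(d2, key=d2.get)              # first index queue
    m_min = min(d2, key=d2.get)              # second index queue
    num = abs(P[m_max] - P[m_min])           # jump height = image contrast
    den = (pvariance(rows[m_max]) + pvariance(rows[m_min])) ** 0.5
    return num / den
```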
FIG. 6 is a schematic diagram of a test setup of an imaging system employing a hardware implementation of a processing system, according to one embodiment of the invention. Referring to fig. 6, the test apparatus 1000 of the present embodiment may include a memory 1300 and a processor 1200. The memory 1300 stores execution instructions that the processor 1200 executes to cause the processor 1200 to perform the method of testing an imaging system of any of the embodiments described above.
The apparatus 1000 may include corresponding modules that perform the steps of the flowcharts discussed above. Thus, each step or several steps in the flowcharts described above may be performed by respective modules, and the apparatus may include one or more of these modules. A module may be one or more hardware modules specifically configured to perform the respective steps, or be implemented by a processor configured to perform the respective steps, or be stored within a computer-readable medium for implementation by a processor, or be implemented by some combination.
For example, the test apparatus 1000 may include an interface feature acquisition module 1002, a control location identification module 1004, an action event trigger module 1006, a CT image acquisition module 1008, an image contrast module 1010, and a test result determination module 1012. The test device 1000 may be built into the control end of the imaging system. The control terminal may be a computer.
The interface feature obtaining module 1002 is configured to obtain, when the control terminal is used as an execution subject of a current node in a testing process, an interface content feature of a testing interface of the control terminal by executing a script file created in advance.
The control position identification module 1004 is configured to analyze the content characteristics of the interface so as to identify the position of the control to be operated in the test interface.
The action event triggering module 1006 is configured to trigger an action event of the control to be operated on the current test interface according to the position of the control to be operated, so that the test process enters the next node, and the action event includes a click event.
The CT image acquiring module 1008 is configured to acquire a first CT image of the three-dimensional CT image under at least one image layer after automatically capturing a phantom through the projection end and reconstructing the captured image to obtain the three-dimensional CT image.
The image comparison module 1010 is configured to compare the first CT image with a corresponding first sample image acquired in advance to obtain an image deviation condition; and
the test result determining module 1012 is used for determining a test result of the imaging system according to the image deviation condition.
It should be noted that, details not disclosed in the testing device 1000 of the imaging system in the present embodiment may refer to details disclosed in the testing method M10 of the imaging system in the foregoing embodiment, which are not described herein.
The hardware architecture may be implemented using a bus architecture. The bus architecture may include any number of interconnecting buses and bridges depending on the specific application of the hardware and the overall design constraints. Bus 1100 connects together various circuits including one or more processors 1200, memory 1300, and/or hardware modules. Bus 1100 may also connect various other circuits 1400, such as peripherals, voltage regulators, power management circuits, external antennas, and the like.
Bus 1100 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one connection line is shown in the figure, but this does not mean there is only one bus or only one type of bus.
The invention also provides an imaging system which comprises a projection end and a control end which are in communication connection with each other, wherein the projection end is provided with a detection position for placing a phantom. The control terminal comprises a memory and a processor. The memory stores execution instructions that the processor executes, causing the processor to execute the test method of the imaging system of any of the above embodiments.
It should be noted that, details not disclosed in the imaging system of the present embodiment may refer to details disclosed in the testing method M10 of the imaging system of the foregoing embodiment, which are not described herein.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes further implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art. The processor performs the various methods and processes described above. For example, method embodiments of the present invention may be implemented as a software program tangibly embodied on a machine-readable medium, such as a memory. In some embodiments, part or all of the software program may be loaded and/or installed via the memory and/or a communication interface. When the software program is loaded into the memory and executed by the processor, one or more steps of the methods described above may be performed. Alternatively, in other embodiments, the processor may be configured to perform one of the methods described above in any other suitable manner (e.g., by means of firmware).
Logic and/or steps represented in the flowcharts or otherwise described herein may be embodied in any readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
It should be understood that portions of the present invention may be implemented in hardware, software, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps implementing the method of the above embodiments may be implemented by a program to instruct related hardware, and the program may be stored in a readable storage medium, where the program when executed includes one or a combination of the steps of the method embodiments. The storage medium may be a volatile/nonvolatile storage medium.
In addition, each functional unit in each embodiment of the present invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product. The storage medium may be a read-only memory, a magnetic disk or optical disk, etc.
The invention also provides a readable storage medium, wherein the readable storage medium stores execution instructions which are used for realizing the testing method of the imaging system in the embodiment when being executed by a processor.
In the description of the present specification, the descriptions of the terms "one embodiment/mode," "some embodiments/modes," "specific examples," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment/mode or example is included in at least one embodiment/mode or example of the present invention. In this specification, the schematic representations of the above terms are not necessarily the same embodiments/modes or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments/modes or examples. Furthermore, the various embodiments/implementations or examples described in this specification and the features of the various embodiments/implementations or examples may be combined and combined by persons skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
It will be appreciated by persons skilled in the art that the above embodiments are provided for clarity of illustration only and are not intended to limit the scope of the invention. Other variations or modifications will be apparent to persons skilled in the art from the foregoing disclosure, and such variations or modifications are intended to be within the scope of the present invention.

Claims (10)

1. The test method of the imaging system is characterized in that the imaging system comprises a projection end and a control end which are in communication connection with each other, the projection end is provided with a detection position for placing a phantom, and the test method comprises the following steps:
when the control end is used as an execution main body of a current node in the test process, interface content characteristics of a test interface of the control end are obtained through executing a script file which is created in advance;
Analyzing the interface content characteristics so as to identify the position of the control to be operated in the test interface;
triggering an action event of the control to be operated on the current test interface according to the position of the control to be operated, so that the test process enters the next node, wherein the action event comprises a click event;
after the body model is automatically shot through the projection end and a shot image is reconstructed to obtain a three-dimensional CT image, a first CT image of the three-dimensional CT image under at least one image layer is obtained;
comparing the first CT image with a corresponding first sample image acquired in advance to obtain an image deviation condition; and
determining a test result of the imaging system according to the image deviation condition; wherein,
after obtaining the first CT image, the testing method further includes:
acquiring at least one region of interest on the first CT image, wherein each edge of the region of interest is parallel to a pixel row or a pixel column in the first CT image, and the region of interest comprises regions of two different media and boundary lines of the two different media;
determining an average value of each pixel array in all pixel arrays parallel to a first direction by taking a direction parallel to the boundary line as the first direction; and
Determining a contrast noise figure according to the arrangement of the average value of the pixel array in a second direction, wherein the second direction is perpendicular to the first direction; wherein,
determining the contrast noise figure according to the arrangement of the average value of the pixel array in the second direction comprises:
respectively determining the average value of the average values of all the pixel queues in a certain area at two sides of the pixel queue, and taking the difference value between the average values at two sides as a first difference value;
determining the difference value between the average value difference values of the adjacent pixel queues to obtain a second difference value of the pixel queues; and
taking a pixel queue corresponding to the maximum value in the second difference value as a first index queue, taking a pixel queue corresponding to the minimum value in the second difference value as a second index queue, and calculating a contrast noise figure according to the average value of the first index queue and the second index queue;
and when determining the test result of the imaging system according to the image deviation condition, determining the test result of the imaging system according to the contrast noise index.
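The contrast-noise-index steps recited above can be sketched as follows. The window size, the edge handling, and the normalisation by the ROI standard deviation are assumptions the claim leaves open; this is an illustrative reading, not the patentee's implementation.

```python
import numpy as np

def contrast_noise_index(roi, window=3):
    """Sketch of the claimed contrast noise index.

    `roi` is a 2-D region of interest whose rows (the "pixel queues")
    are parallel to the media boundary (the first direction).
    """
    # Average value of each pixel queue parallel to the first direction.
    queue_means = roi.mean(axis=1)
    n = len(queue_means)

    # First difference value: mean of queue means within a window on
    # each side of a queue, then the difference between the two sides.
    first_diff = np.full(n, np.nan)
    for i in range(window, n - window):
        left = queue_means[i - window:i].mean()
        right = queue_means[i + 1:i + 1 + window].mean()
        first_diff[i] = right - left

    # Second difference value: difference between the first difference
    # values of adjacent pixel queues.
    second_diff = np.diff(first_diff)
    valid = ~np.isnan(second_diff)
    idx = np.arange(len(second_diff))[valid]
    i_max = idx[np.argmax(second_diff[valid])]   # first index queue
    i_min = idx[np.argmin(second_diff[valid])]   # second index queue

    # Contrast from the two index-queue means, normalised by ROI noise
    # (the normalisation is an assumption).
    noise = roi.std()
    return abs(queue_means[i_max] - queue_means[i_min]) / noise if noise else 0.0
```

Because the second difference peaks where the queue means change fastest, the two index queues land on opposite sides of the media boundary, which is what makes the ratio a contrast-to-noise measure.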
2. The test method of claim 1, wherein the interface content features comprise an interface image, the interface image being a partial or complete image of the test interface, and the interface image comprising controls to be operated.
3. The test method according to claim 2, wherein analyzing the interface content features to identify the position of the control to be operated in the test interface comprises:
taking the interface image included in the interface content features as a first interface image, comparing the first interface image with pre-acquired second sample images, and identifying the second sample image corresponding to the first interface image; and
determining the position of the control to be operated in the test interface according to the position of the control to be operated in the identified second sample image.
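The image-based control location of claim 3 can be illustrated with a minimal exhaustive template match. The grayscale arrays and the sum-of-squared-differences score are assumptions; a production harness would more likely use a library routine such as OpenCV's `matchTemplate`.

```python
import numpy as np

def locate_control(screenshot, template):
    """Find the centre of the best match of `template` (a sample image
    of the control) inside `screenshot` (the interface image) by
    exhaustive sum-of-squared-differences search."""
    H, W = screenshot.shape
    h, w = template.shape
    best_score, best_xy = float("inf"), (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            patch = screenshot[y:y + h, x:x + w]
            score = float(((patch - template) ** 2).sum())
            if score < best_score:
                best_score, best_xy = score, (x, y)
    # Centre of the matched region -- the point at which the click
    # event of claim 1 would be dispatched.
    return best_xy[0] + w // 2, best_xy[1] + h // 2
```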
4. The test method according to claim 1, wherein the nodes for which the control end serves as the execution subject in the test process comprise at least one of the following: inputting test information, confirming the test information, starting capture, and displaying a CT image; when the nodes for which the control end serves as the execution subject comprise a node for inputting test information, the action event further comprises a text input event.
5. The test method according to claim 1, wherein before the phantom is automatically captured by the projection end and the captured images are reconstructed to obtain the three-dimensional CT image, the test method further comprises:
when entering a positioning confirmation node, acquiring address information of the projection end from the control end; and
sending a positioning confirmation instruction to the projection end according to the address information, so that the projection end feeds back a first signal indicating that positioning is complete, wherein the first signal is used to trigger the test process to enter the node following the positioning confirmation node.
6. The test method according to claim 5, wherein before the address information of the projection end is acquired from the control end, the test method further comprises:
identifying a status value of a positioning flag bit, entering the node following the positioning confirmation node directly when the status value is equal to a preset status value, and entering the positioning confirmation node when the status value is not equal to the preset status value.
7. The test method according to claim 6, wherein if the status value of the positioning flag bit is not equal to the preset status value, the status value of the positioning flag bit is changed to the preset status value before the test process is triggered to enter the node following the positioning confirmation node.
8. The test method according to claim 1, wherein comparing the first CT image with the pre-acquired corresponding first sample image to obtain the image deviation condition comprises:
acquiring histograms of the first CT image and the corresponding first sample image;
taking the histogram of the first CT image as a first histogram, taking the histogram of the corresponding first sample image as a second histogram, and calculating the difference in the number of pixels at each corresponding gray value between the first histogram and the second histogram;
calculating a difference value between the first histogram and the second histogram according to these differences; and
when the difference value is smaller than a preset difference value, the obtained image deviation condition indicates that the image quality is qualified; otherwise, the obtained image deviation condition indicates that the image quality is unqualified.
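The histogram comparison of claim 8 can be sketched as below. The claim only fixes that the difference value is derived from per-gray-value pixel-count differences; the absolute per-bin difference, its normalisation by the image size, and the 0.05 preset are illustrative assumptions.

```python
import numpy as np

def histogram_difference(test_img, sample_img, bins=256):
    """Per-gray-value pixel-count comparison between the first CT image
    and the corresponding first sample image."""
    h1, _ = np.histogram(test_img, bins=bins, range=(0, bins))    # first histogram
    h2, _ = np.histogram(sample_img, bins=bins, range=(0, bins))  # second histogram
    per_gray_diff = np.abs(h1 - h2)             # pixel-count difference per gray value
    return per_gray_diff.sum() / test_img.size  # normalised difference value

def image_quality_qualified(test_img, sample_img, preset=0.05):
    """Qualified when the difference value is below the preset value."""
    return histogram_difference(test_img, sample_img) < preset
```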
9. A test device for an imaging system, characterized in that the imaging system comprises a projection end and a control end in communication connection with each other, the projection end being provided with a detection position for placing a phantom, the test device comprising:
a memory storing execution instructions; and
a processor executing the execution instructions stored in the memory, causing the processor to perform the test method of the imaging system of any one of claims 1 to 8.
10. An imaging system comprising a projection end and a control end in communication connection with each other, the projection end being provided with a detection position for placing a phantom, the control end comprising:
a memory storing execution instructions; and
a processor executing the execution instructions stored in the memory, causing the processor to perform the test method of the imaging system of any one of claims 1 to 8.
CN202310976430.6A 2023-08-04 2023-08-04 Imaging system testing method and device and imaging system Active CN116703908B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310976430.6A CN116703908B (en) 2023-08-04 2023-08-04 Imaging system testing method and device and imaging system

Publications (2)

Publication Number Publication Date
CN116703908A CN116703908A (en) 2023-09-05
CN116703908B true CN116703908B (en) 2023-10-24

Family

ID=87831486

Country Status (1)

Country Link
CN (1) CN116703908B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102760283A (en) * 2011-04-28 2012-10-31 深圳迈瑞生物医疗电子股份有限公司 Image processing method, image processing device and medical imaging equipment
CN103269439A (en) * 2013-05-21 2013-08-28 杭州电子科技大学 OCT image quality objective non-reference type evaluation method
CN103645890A (en) * 2013-11-29 2014-03-19 北京奇虎科技有限公司 Method and device for positioning control part in graphical user interface
CN103903224A (en) * 2012-12-25 2014-07-02 腾讯科技(深圳)有限公司 Digital image banding noise processing method and apparatus
US8837800B1 (en) * 2011-10-28 2014-09-16 The Board Of Trustees Of The Leland Stanford Junior University Automated detection of arterial input function and/or venous output function voxels in medical imaging
CN105336003A (en) * 2015-09-28 2016-02-17 中国人民解放军空军航空大学 Three-dimensional terrain model real-time smooth drawing method with combination of GPU technology
CN105808416A (en) * 2014-12-27 2016-07-27 南车株洲电力机车研究所有限公司 An automatic test method and system for man-machine graphic interaction interfaces
CN105988924A (en) * 2015-02-10 2016-10-05 中国船舶工业综合技术经济研究院 Automatic testing method for non-intrusive type embedded software graphical user interface
US9979956B1 (en) * 2016-06-09 2018-05-22 Oculus Vr, Llc Sharpness and blemish quality test subsystem for eyecup assemblies of head mounted displays
CN109498051A (en) * 2018-12-30 2019-03-22 深圳安科高技术股份有限公司 A kind of CT hospital bed rack automated location calibration method and its system
CN110292390A (en) * 2019-06-28 2019-10-01 南京安科医疗科技有限公司 CT layers of sensitivity curve test body mould Set-up errors bearing calibration
CN110399191A (en) * 2019-06-28 2019-11-01 奇安信科技集团股份有限公司 A kind of program graphic user interface automatic interaction processing method and processing device
CN111751390A (en) * 2020-06-08 2020-10-09 洛阳中信成像智能科技有限公司 Industrial CT image analysis automatic interaction system and use method thereof
CN112749081A (en) * 2020-03-23 2021-05-04 腾讯科技(深圳)有限公司 User interface testing method and related device
CN114359252A (en) * 2022-01-11 2022-04-15 长春汽车工业高等专科学校 Car door window lifting performance detecting system based on machine vision
CN114624266A (en) * 2021-12-31 2022-06-14 深圳明锐理想科技有限公司 Multi-axis repetition precision testing method for X-ray device
CN114864035A (en) * 2022-05-07 2022-08-05 有方(合肥)医疗科技有限公司 Image report generation method, device, system, equipment and storage medium
CN115854938A (en) * 2022-11-07 2023-03-28 重庆大学 CT detection and evaluation method for large-size additive manufacturing ramjet engine
CN115982005A (en) * 2022-11-29 2023-04-18 中国银行股份有限公司 Automatic testing method and device based on artificial intelligence

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
The generalized contrast-to-noise ratio: a formal definition for lesion detectability; IEEE; 2016; full text *
Influence of mobile digital X-ray radiography conditions on chest image quality; Zhong Chaohui, Lu Dongsheng; Chinese Journal of Medical Imaging Technology (03); full text *
Left-ventricle image segmentation algorithm for weak edge information; Wang Nan et al.; Computer Engineering and Applications; full text *

Similar Documents

Publication Publication Date Title
US7953265B2 (en) Method and system for automatic algorithm selection for segmenting lesions on pet images
RU2523929C1 (en) System and method for automated planning of views in 3d brain images
US11657491B2 (en) Learning data collection apparatus, learning data collection method, and program
EP3174467B1 (en) Ultrasound imaging apparatus
CN110584714A (en) Ultrasonic fusion imaging method, ultrasonic device, and storage medium
CN110008947B (en) Granary grain quantity monitoring method and device based on convolutional neural network
KR102202617B1 (en) Method and apparatus for analyzing abdominal disease based on medical image
US20120099776A1 (en) Medical image processing apparatus
CN105468891A (en) Apparatus and method for supporting computer aided diagnosis
JP6821326B2 (en) Information processing equipment, measurement systems, information processing methods and programs
CN112767309A (en) Ultrasonic scanning method, ultrasonic equipment and system
US8447082B2 (en) Medical image displaying apparatus, medical image displaying method, and medical image displaying program
CN116703908B (en) Imaging system testing method and device and imaging system
CN110335248B (en) Medical image focus detection method, device, computer equipment and storage medium
US20090310883A1 (en) Image processing apparatus, method, and program
CN112565615B (en) Method and device for determining trigger point of flying shooting
CN112699919B (en) Wood identification method and device based on three-dimensional convolutional neural network model
WO2018181868A1 (en) Information processing device, rebar counting device, method, and program
CN109597067A (en) Millimeter wave radiometer alignment scans the analysis method and system of low resolution target
CN117705835A (en) X-ray image generating method for detecting defect of object internal member
CN115797729B (en) Model training method and device, motion artifact identification and prompting method and device
CN111899265A (en) Image analysis method, image analysis device, computer equipment and storage medium
KR20220111214A (en) Method, apparatus and computer program for inspection of product based on artificial intelligence
KR102322995B1 (en) Method for artificial intelligence nodule segmentation based on dynamic window and apparatus thereof
JP2003265463A (en) Image diagnosis support system and image diagnosis support program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant