WO2015078148A1 - Method and system for ultrasound-assisted scanning - Google Patents


Publication number
WO2015078148A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
library
real-time image
matching
Prior art date
Application number
PCT/CN2014/077325
Other languages
English (en)
Chinese (zh)
Inventor
温博
邹耀贤
Original Assignee
深圳迈瑞生物医疗电子股份有限公司
Priority date
Filing date
Publication date
Application filed by 深圳迈瑞生物医疗电子股份有限公司
Publication of WO2015078148A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 - Displaying means of special interest
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H 30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H 40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 - ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/63 - ICT specially adapted for the management or operation of medical equipment or devices for local operation

Definitions

  • The invention belongs to the field of ultrasonic imaging technology and relates to an ultrasound-assisted scanning method and system. Background Art
  • Ultrasonic instruments are generally used by doctors to observe the internal structure of the human body.
  • The doctor places the probe on the skin surface over the part of the body of interest, and an ultrasound image of that part is obtained.
  • Ultrasound has become a major aid to diagnosis because it is safe, convenient, non-invasive and low-cost. However, operating an ultrasonic instrument is complex: the doctor must have a very clear understanding of the spatial structure of the organs and tissues of the human body in order to produce standard sections of those organs and tissues.
  • Newly trained ultrasound doctors, clinicians in emerging fields, private-clinic doctors, some nursing staff, and many others need to improve their ultrasound knowledge and skills, and all face a shortage of training resources.
  • The advantage of integrating teaching software into the ultrasound system is that the user can learn while practicing, rather than learning theoretical knowledge only from books, which greatly improves training efficiency.
  • Current ultrasound-assisted teaching systems can present graphics and text and teach step by step, but only when the user selects a particular standard section does the system display the standard image of that section and the related illustrated explanation. While scanning, the doctor must hold the probe in one hand, which makes it inconvenient to select the desired section.
  • Moreover, existing ultrasound teaching systems interact poorly with the operator: the system cannot give feedback based on the user's real-time operation, so the user cannot judge well whether an operation is correct and cannot obtain effective guidance tailored to the actual situation. A means is therefore needed to help users learn and improve their ultrasound scanning technique faster and better. Summary of the Invention
  • The present invention gives the user real-time feedback on whether an operation is correct and, according to the operation result, automatically either presents the detailed illustrated explanation of the standard section or explains how to adjust the operation.
  • An ultrasound-assisted scanning method includes transmitting ultrasonic waves into a body under test at a probe position, receiving the echo signal reflected by the body under test, and generating a current real-time image.
  • the method also includes:
  • outputting a matching result of the real-time image against a plurality of pre-established library images, and, according to the matching result, retrieving the standard image or providing an instruction on how to adjust the probe position so as to obtain a real-time image that matches the standard image.
  • An ultrasound-assisted scanning system includes: a probe for transmitting ultrasonic waves into a body under test at a probe position and receiving the echo signal reflected by the body under test;
  • a signal processor configured to process the echo signal and generate a current real-time image;
  • a display configured to display the generated real-time image.
  • The ultrasound-assisted scanning system further includes:
  • an image library configured to store a plurality of pre-established library images, the plurality of library images including a standard image reflecting a clinical standard section of the body under test;
  • an image matching module configured to perform similarity matching between the generated real-time image and the plurality of library images;
  • an output configuration module configured to cause the display to output the matching result of the real-time image against the plurality of library images, and, according to the matching result, to retrieve the standard image or to explain how to adjust the probe position so as to obtain a real-time image that matches the standard image.
  • The invention obtains the following beneficial effects. Through similarity matching between the real-time image and the library images, it can automatically report whether the user's current scanning operation is correct and determine whether the obtained image is a standard image of a given tissue or organ. Further, based on the matching result, when a standard image is obtained the invention can automatically call up the graphic information corresponding to that image's clinical standard section, overcoming the cumbersome prior-art requirement that the user manually select the scanning section. When a standard image is not obtained, the invention can give the user real-time guidance on how to adjust the probe position. This greatly improves learning efficiency and enables the user to quickly improve their scanning skill.
  • FIG. 1 is an exemplary flow chart of an ultrasonic scanning method of the present invention.
  • FIG. 2 is an exemplary flow chart of an ultrasonic scanning method in the first embodiment of the present invention.
  • FIG. 3 is an exemplary flow chart of an ultrasonic scanning method in a second embodiment of the present invention.
  • FIG. 4 is a schematic diagram of the positions of feature points on a library image and the corresponding matching point positions in a real-time image.
  • FIG. 5 is an exemplary flowchart of an ultrasonic scanning method in a third embodiment of the present invention.
  • FIG. 6 is an exemplary block diagram of an ultrasonic scanning system of the present invention.
  • Figure 7a is an exemplary block diagram of an ultrasonic scanning system in accordance with a particular embodiment of the present invention.
  • Figure 7b is another exemplary block diagram of an ultrasound scanning system in accordance with a particular embodiment of the present invention. Detailed Description
  • "Image" is used generically hereinafter; where the specific type of image is not stated, "image" refers in particular to an ultrasound image, since the invention belongs to the field of ultrasonic imaging technology.
  • The invention provides a scheme by which an ultrasound device intelligently assists beginners in rapidly improving their ultrasound scanning technique and knowledge, and in particular provides an ultrasound-assisted scanning method and system.
  • FIG. 1 is an exemplary flow chart of an ultrasonic scanning method according to the present invention. The basic steps are as follows. A delay-focused pulse is sent to the probe through the transmitting circuit; the probe transmits ultrasonic waves into the body under test and, after a certain delay, receives the ultrasound reflected from the body under test.
  • The echo signal enters the beamformer, which completes focus delay, weighting and channel summation; signal processing then yields a real-time ultrasound image. The currently acquired real-time image is then matched against the pre-established library information, the image help information required for the real-time image is output according to the matching result, and that information is displayed on the display together with the real-time image.
  • "Real-time image" is used hereinafter as shorthand for "real-time ultrasound image".
  • Image matching here refers specifically to similarity matching between the real-time image and the library information.
  • The process may include the following two steps: A, building the library; and B, similarity matching calculation.
  • Step A is an offline step: the image library is established before scanning, and during an actual scan only the corresponding library images need to be retrieved from the image library.
  • The image library to which the present invention relates includes a detailed ultrasound library of human body sections.
  • For the lateral, longitudinal and oblique directions of each organ or tissue, probe placement positions are determined at fixed small distance intervals.
  • With the probe perpendicular to the body as the reference, the probe is swung to each side in fixed angular steps, and one ultrasound image per swing is stored in the image library as a library image.
  • The library images should include, in particular, standard images that reflect the clinical standard sections.
  • Each library image in the library records the direction of its section and the swing angle of the probe.
  • The standard images used in clinical practice (especially the standard sonograms), and the series of library images corresponding to the standard-image probe positions, need to be specially marked; the probe position corresponding to each clinical standard section and the contour model of the current probe position are recorded.
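  • The library entries described above (section direction, swing angle, standard-section marking) are not given a concrete schema in the text. As a minimal illustrative sketch, with all field and value names being assumptions, each entry might be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class LibraryImage:
    """One pre-stored ultrasound image in the image library.

    Field names are illustrative; the text only specifies *what* is
    recorded (section direction, probe swing angle, standard-section
    marking), not a concrete schema.
    """
    image_id: str
    section_direction: str        # "lateral", "longitudinal" or "oblique"
    probe_position: tuple         # placement on the body-surface grid
    swing_angle: float            # swing angle in degrees (0 = perpendicular)
    is_clinical_standard: bool = False   # specially marked standard section?
    feature_points: list = field(default_factory=list)  # precomputed offline (step A)

# Example entry for a marked clinical standard section (values hypothetical)
entry = LibraryImage("liver_long_012", "longitudinal", (3, 7), 15.0,
                     is_clinical_standard=True)
```

The offline step A would populate a collection of such entries; the online step B only reads them.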
  • Step B is based on the distinctive features of each library image in the image library, for example a high-echo region of a particular shape at a particular location in the image.
  • Correlation is calculated with a pattern matching algorithm based on the image features of the library images and the real-time image, and the library image with the highest correlation is taken as the corresponding matching image.
  • The present invention can implement image matching with image-block-based or feature-point-based pattern matching methods. Step B is developed in detail in the specific embodiments below.
  • The image matching result of the present invention may include: whether the matching succeeded; whether the matched real-time image reflects a section corresponding to a standard probe position; and whether the section reflected by a real-time image corresponding to a standard probe position corresponds to the clinical standard section at that position.
  • The "image help information" described above may include the following cases.
  • Matching failure: prompt information is given while the reference phantom of the body under test is displayed, telling the user that the current scan operation is incorrect and showing in the reference phantom how to move the probe.
  • The image help information in this case includes reference phantom information, matching failure information and adjustment prompt information.
  • Non-standard probe position: the reference phantom is displayed at the output.
  • A probe position mark (distinct from the standard probe position mark) appears at the real-time section position; the user is informed that there is no standard section at the current probe placement and prompted to adjust according to the standard probe positions on the reference phantom.
  • The image help information in this case includes reference phantom information, real-time and standard probe position information, and adjustment prompt information.
  • Non-clinical standard section at a standard probe position: if the section reflected by the real-time image corresponding to a standard probe position does not correspond to the clinical standard section at that position, the current probe position is highlighted on the displayed reference phantom (for example by highlighting or a special color mark),
  • prompting the user how far the probe should be rotated to the left or right; the system can also feed back whether the current probe is placed at the position of the clinical standard section, so that the user can adjust the probe angle according to the feedback.
  • The image help information in this case includes reference phantom information, highlighted probe position information, adjustment prompt information and clinical-standard-section judgment information.
  • Matching success: the matching result indicates that the user's scan operation is correct.
  • Not only is the probe mark corresponding to the standard probe position highlighted on the reference phantom, but the graphic information corresponding to the clinical standard section, such as the standard sonogram, anatomical diagram, scanning-technique map and scanning technique, is also displayed (for example, in the help information area).
  • That is, the image help information in this case includes reference phantom information, highlighted probe position information, and the help information corresponding to the clinical standard section.
  • Embodiment 1 Image block-based matching method (similarity calculation)
  • FIG. 2 discloses an ultrasound-assisted scanning method according to a first embodiment of the present invention, which implements image matching with an image-block-based pattern matching method.
  • This embodiment can be divided into two phases: a pattern matching phase that determines the matching points (step S12) and a search phase that screens for the optimal matching library image (step S13).
  • Matching points for all feature points are determined on the real-time image by similarity calculation, and the library image with the highest similarity is selected according to the degree of matching between all the feature points of each library image and their corresponding matching points, where the degree of matching refers especially to the similarity between the two.
  • The method further includes a preparation phase that establishes the library image feature points before the matching points are determined.
  • The preparation phase is usually completed before matching starts, and only the already determined feature points need to be retrieved during real-time operation (step S11).
  • A number of feature points are determined for each library image in the image library; the number can be set as needed, for example 20 feature points per image.
  • Feature points may be established in various ways. For example, several points with relatively obvious features may be determined manually, such as the edge contour points of the organs in the image or the intersections of tissues. Since the image library is usually large, the feature points of each library image can instead be computed automatically by a program.
  • One method for automatically calculating image feature points is as follows.
  • Step 1: divide the library image into several sub-regions.
  • Step 2: establish one feature point P_ij in each sub-region, where P_ij denotes the j-th feature point of the i-th library image.
  • The method for selecting the sub-region feature point can be chosen as needed; for example, the point with the largest gradient, or the highest gray level, in the sub-region can be selected as that sub-region's feature point.
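  • The sub-region scheme of steps 1 and 2 with the largest-gradient rule can be sketched as follows; the 4*5 grid (which yields the 20 points of the earlier example) and the use of gradient magnitude are illustrative choices:

```python
import numpy as np

def subregion_feature_points(image, rows=4, cols=5):
    """Split a library image into rows*cols sub-regions and pick, in each,
    the pixel with the largest gradient magnitude as that sub-region's
    feature point (one of the selection rules mentioned in the text)."""
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)
    h, w = image.shape
    points = []
    for i in range(rows):
        for j in range(cols):
            r0, r1 = i * h // rows, (i + 1) * h // rows
            c0, c1 = j * w // cols, (j + 1) * w // cols
            sub = grad[r0:r1, c0:c1]
            dr, dc = np.unravel_index(np.argmax(sub), sub.shape)
            points.append((r0 + dr, c0 + dc))
    return points

# A 4*5 grid gives 20 feature points, matching the "e.g. 20 per image" example
img = np.random.default_rng(0).integers(0, 256, (200, 250))
pts = subregion_feature_points(img)
```

In a real system this would run offline (step A) and the points would be stored with each library image.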
  • The pattern matching phase that determines the matching points must determine, on the real-time image, a matching point for each feature point of each library image. It includes the following sub-steps.
  • S121: Take each feature point of each library image in the image library, and determine the search range of each feature point on the current real-time image.
  • The search range is an empirical parameter and can be set as needed.
  • The neighborhood of the corresponding point on the real-time scan image, located from the coordinates of the current feature point, is taken as the search range.
  • The neighborhood size may be, for example, 200*200.
  • S122: Take a feature point P_ij, and determine a neighborhood block of size W*H centered on it as a template on the library image.
  • S123: Within the determined search range, take neighborhood blocks of the same size as the template (W*H), centered on each of a plurality of pixels in the region, for the subsequent similarity calculation.
  • Each pixel in the selected region can serve as a center in turn, and each feature point of each library image is matched one by one.
  • S124: Calculate the similarity value between the template from the library image and each neighborhood block of the real-time image, and take the center of the most similar neighborhood block as the matching point of that template's feature point in the real-time image. Specifically, a library image is selected first; then the similarity value between the template of each of its feature points and each neighborhood block of the real-time image is calculated, and for each feature point's template the neighborhood block with the best similarity value is determined, which yields the matching points corresponding to all the feature points of that library image. Another library image is then selected and the above steps are repeated until the matching points for all feature points of all library images are obtained.
  • One way to measure the similarity between a template and a neighborhood block is to calculate the sum of the absolute values of the pixel differences between them:
  • E1 = Σ_{(x,y)∈Ω} | Il(x,y) − Ir(x,y) |   (1)
  • where E1 is the sum of the absolute values of the pixel differences between the template and the neighborhood block, Ω is the neighborhood, and Il and Ir are the gray values of the pixels in the template and in the neighborhood block, respectively. It can be seen from equation (1) that the smaller the similarity value, the better the similarity.
  • Another way is to take E2 as the sum of the correlation coefficients of the pixels of the template and of the neighborhood block (equation (2)), with Ω, Il and Ir as above. It can be seen from equation (2) that the larger the similarity value, the better the similarity.
  • The matching point most similar to a feature point is the point where the E1 value is smallest or the E2 value is largest. It should also be understood that not every feature point will have a corresponding matching point.
  • When no matching point exists, a fixed similarity value is assigned to the feature point: a larger value under the pixel-difference measure of equation (1), and a smaller value under the correlation-coefficient measure of equation (2).
  • S125: The similarity value of the block most similar to a feature point is used as the criterion for deciding whether a matching point has been found. This step is a preferred step.
  • An empirical parameter E_Thre (also called the first matching standard value) can be defined to decide whether the center of the most similar neighborhood block is indeed the feature point's matching point in the real-time image. With equation (1) as the measure, if the best similarity value of a feature point is greater than E_Thre, the feature point has no corresponding matching point in the real-time image; with equation (2) as the measure, if the best similarity value is smaller than E_Thre, the feature point likewise has no corresponding matching point.
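  • Steps S121 to S125 under the formula (1) measure can be sketched as follows. Template and search sizes are illustrative (the text's 200*200 example neighborhood would correspond to a much larger `search`), and E_Thre is passed in as `e_thre`:

```python
import numpy as np

def find_matching_point(library_img, live_img, fp, half=8, search=30,
                        e_thre=None):
    """Find the matching point of one library-image feature point.

    fp:     (row, col) feature point on the library image (S121/S122)
    half:   template half-size, so W = H = 2*half + 1
    search: half-width of the search neighborhood on the live image
    e_thre: first matching standard value; if the best SAD exceeds it,
            the feature point is declared to have no matching point (S125).
    """
    r, c = fp
    template = library_img[r-half:r+half+1, c-half:c+half+1].astype(float)
    best_val, best_pt = np.inf, None
    h, w = live_img.shape
    # S123/S124: slide same-size blocks over the search range, keep the best SAD
    for rr in range(max(half, r-search), min(h-half, r+search+1)):
        for cc in range(max(half, c-search), min(w-half, c+search+1)):
            block = live_img[rr-half:rr+half+1, cc-half:cc+half+1].astype(float)
            e1 = np.abs(template - block).sum()   # formula (1): smaller is better
            if e1 < best_val:
                best_val, best_pt = e1, (rr, cc)
    if e_thre is not None and best_val > e_thre:
        return None, best_val                     # no matching point found
    return best_pt, best_val
```

Matching a library image against itself should locate each feature point exactly, with a SAD of zero.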
  • The search phase that screens for the optimal matching library image must find the library image most similar to the real-time image. The first embodiment performs this screening based on the similarity between all the feature points of each single library image and their matching points in the real-time image.
  • The theoretical basis is this: if the real-time image and a library image do not show the same section, many feature points of the library image will have no matching points in the real-time image; if they do show the same section, most feature points of the library image will have matching points in the real-time image. A method for finding the library image most similar to the current real-time image is therefore to calculate, for each library image, the sum of the similarity values of its feature points with their corresponding matching points in the real-time image (step S131), that is,
  • E_i = Σ_j E_ij   (3)
  • where E_ij, the similarity value of the j-th feature point of the i-th library image, can be calculated by the method exemplified by formula (1) or (2).
  • For a feature point with no matching point, a fixed constant can be used as its similarity value (the fixed similarity value above) to compensate.
  • Under equation (1), a larger value can be set as that feature point's similarity value.
  • Under equation (2), a smaller value can be set instead, for example zero. After E_i has been computed for all library images, the library image with the optimal value is selected as the one with the best similarity to the current real-time image.
  • In step S132, another empirical parameter, a threshold (the first slice matching threshold), may be set to decide whether the current real-time image and the library image found with the best similarity really show the same section; the parameter depends on the similarity measure adopted.
  • For example, when formula (1) is used, if the optimal similarity value is greater than the set matching threshold, the current real-time image is considered to have no same-section image in the library; when formula (2) is used, if the optimal similarity value is less than the set matching threshold, the current real-time image is likewise considered to have no same-section image in the library.
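  • Formula (3) with the fixed compensation constant and the first slice matching threshold of step S132, under the formula (1) measure, might look like the following sketch; the penalty and threshold values are illustrative:

```python
PENALTY = 1e6   # fixed "large" similarity value for feature points with no
                # matching point (under the SAD measure, large = dissimilar)

def best_library_image(per_image_sads, slice_threshold):
    """Step S13 under formula (1): E_i = sum of per-feature-point SAD values
    (formula (3)), with PENALTY substituted where no matching point exists.
    The image with the smallest E_i wins; the first slice matching threshold
    then decides whether it really shows the same section."""
    totals = {}
    for image_id, sads in per_image_sads.items():
        totals[image_id] = sum(PENALTY if s is None else s for s in sads)
    winner = min(totals, key=totals.get)
    if totals[winner] > slice_threshold:   # S132: no same-section image exists
        return None, totals[winner]
    return winner, totals[winner]

# Per-feature-point SAD values; None means no matching point was found
scores = {"img_a": [120.0, None, 95.0], "img_b": [80.0, 70.0, 60.0]}
match, e = best_library_image(scores, slice_threshold=500.0)
```

Here "img_a" is heavily penalized for its unmatched feature point, so "img_b" wins with E = 210.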
  • Embodiment 1 can thus screen for the most similar library image through the similarity calculations of steps S12 and S13. However, this method relies heavily on the similarity calculation, and because ultrasound images are noisy, it may in some cases impair the screening of the most similar library image.
  • Embodiment 2 Image block-based matching method (topological property)
  • FIG. 3 discloses an ultrasound-assisted scanning method according to a second embodiment of the present invention.
  • This embodiment also implements image matching with an image-block-based pattern matching method, and is likewise divided into two stages: a pattern matching phase that determines the matching points (step S22) and a search phase that screens for the optimal matching library image (step S23).
  • The specific process for determining the matching points is the same as in Embodiment 1 and is not repeated here.
  • When screening for the optimal matching library image, the similarity calculation result is no longer used; instead, the topological properties of the feature points are used for the screening.
  • For the feature points of each library image, their relative positional relationship on the plane is fixed.
  • Step S23 includes the following sub-steps:
  • S234: Compare the determined minimum angle difference with a second slice matching threshold; if the minimum angle difference is smaller than the second slice matching threshold, take the corresponding library image as the optimal matching library image of the real-time image. This step is also a preferred step, which improves the accuracy of the matching judgment.
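  • Sub-steps S231 to S233 are not reproduced above, so the following is only one plausible reading of the topological screening: since the feature points' relative positions on the plane are fixed, compare the directions of successive feature-point pairs on the library image with those of the corresponding matching-point pairs on the live image, aggregate the angle differences per library image, and then apply S234. The aggregation rule and all names are assumptions:

```python
import numpy as np

def mean_angle_difference(feature_pts, matching_pts):
    """Average absolute difference between the direction of each successive
    feature-point pair (library image) and the direction of the
    corresponding matching-point pair (live image). A rigid translation
    leaves every pair direction unchanged, so its score is zero."""
    diffs = []
    for (p1, p2), (q1, q2) in zip(zip(feature_pts, feature_pts[1:]),
                                  zip(matching_pts, matching_pts[1:])):
        a = np.arctan2(p2[0] - p1[0], p2[1] - p1[1])
        b = np.arctan2(q2[0] - q1[0], q2[1] - q1[1])
        d = abs(a - b) % (2 * np.pi)
        diffs.append(min(d, 2 * np.pi - d))   # wrap to [0, pi]
    return float(np.mean(diffs))

def screen_by_topology(candidates, threshold):
    """S234: pick the library image with the minimum angle difference and
    accept it only if that difference is below the second slice matching
    threshold."""
    best = min(candidates, key=candidates.get)
    return best if candidates[best] < threshold else None
```

For example, a pure translation of the point set yields an angle difference of zero and would always pass the threshold.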
  • The present invention can also determine the matching points of the feature points of each library image by the following steps.
  • First, the feature points may be established by the specific method described in either of Embodiments 1 and 2.
  • The search range corresponding to each feature point is still determined on the real-time image; the specific process is the same as in the two embodiments above.
  • A template of a certain size is determined centered on each feature point of the library image.
  • Let A be the m*n matrix of the template (m and n being the template dimensions), and calculate
  • the eigenvalues [λ_1, ..., λ_n] of A'A (A' is the transpose of matrix A).
  • Within the search range on the real-time image, neighborhood blocks B of the same size are taken centered on each of a plurality of pixels in the region.
  • Every pixel may be selected as a center, or one pixel may be selected as a neighborhood block center every N pixels.
  • Then calculate the neighborhood-block eigenvalues [λ'_1, ..., λ'_n] of B'B (B' is the transpose of matrix B).
  • The similarity between the template eigenvalues and the neighborhood-block eigenvalues is measured by the method of formula (1) or (2) above, and the result is used as the similarity value between the template of the library image and the neighborhood block of the real-time image.
  • The point with the best similarity can then be selected as the matching point of the feature point in question.
  • A matching standard value may also be preset; after the optimal similarity value is obtained, it is checked against the range defined by the matching standard value, further increasing the accuracy of the matching operation.
  • In this way the m*n template is reduced to n eigenvalues, which largely express the main characteristics of the template image, and the amount of calculation in the similarity computation of formula (1) or (2) is greatly reduced.
  • The image-block-based matching methods compute feature points only on the library images and search for their matching points in the real-time image through similarity matching. Because the target position in the real-time image is uncertain, the search range is generally large, which leads to a relatively large amount of calculation.
  • Another method for relating the real-time image to the library images is feature-point-based image matching, described in detail in Embodiment 3.
  • Embodiment 3 Feature point based matching method
  • FIG. 5 discloses an ultrasound-assisted scanning method according to a third embodiment of the present invention, which implements image matching with a feature-point-based pattern matching method.
  • The embodiment can be divided into three stages: determining the feature points of the library images and of the real-time image (step S31); establishing the feature-point correspondence (step S32); and the search phase for the optimal matching library image (step S33).
  • The correspondence between the feature points of the real-time image and those of the library images is established by similarity calculation, and the optimal matching library image is screened out according to the number of corresponding feature points of each library image, or according to the sum of the similarity values of the feature-point pairs between each library image and the real-time image.
  • A "corresponding feature point" is a point in a library image that has a correspondence with a feature point of the real-time image.
  • Step S31: acquire the feature points of each of the plurality of library images and of the real-time image.
  • One specific implementation computes the feature points together: the same method is used to calculate the feature points of the real-time image and of the library images.
  • Commonly used feature points include corner points, inflection points, edge points, and so on.
  • Common calculation methods include selecting the point with the largest gradient in a sub-region as the feature point; or calculating, for each point in the image, the local autocorrelation in four directions, taking the minimum of the autocorrelation results as the characteristic value of the point, and optionally further checking whether that value exceeds an experience threshold, the point being regarded as a feature point only when the threshold is exceeded.
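  • The four-direction local autocorrelation rule described here is essentially a Moravec-style corner measure. A minimal sketch follows; the window size and the four shift directions (horizontal, vertical, two diagonals) are illustrative choices:

```python
import numpy as np

def moravec_response(image, half=2):
    """For each interior pixel, compute the SSD between a local window and
    the window shifted in four directions, and keep the minimum as the
    pixel's characteristic value. Edges score low (the shift along the edge
    changes nothing); corners score high in all four directions."""
    img = image.astype(float)
    h, w = img.shape
    resp = np.zeros((h, w))
    shifts = [(0, 1), (1, 0), (1, 1), (1, -1)]
    for r in range(half + 1, h - half - 1):
        for c in range(half + 1, w - half - 1):
            win = img[r-half:r+half+1, c-half:c+half+1]
            ssd = []
            for dr, dc in shifts:
                moved = img[r-half+dr:r+half+1+dr, c-half+dc:c+half+1+dc]
                ssd.append(((win - moved) ** 2).sum())
            resp[r, c] = min(ssd)
    return resp

def feature_points(image, threshold):
    """Keep only points whose characteristic value exceeds the experience
    threshold, as described in the text."""
    resp = moravec_response(image)
    return list(zip(*np.where(resp > threshold)))
```

A flat image yields no feature points, while a bright rectangular region yields responses at its corners.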
  • Another specific implementation calculates the feature points of the library images first, and then calculates the feature points of the real-time image in real time with the same calculation method.
  • The feature points of the library images can thus be calculated and stored in advance, reducing the amount of calculation during scanning.
  • Step S32: A large number of feature points are obtained in the real-time image and in each library image, but not all of them have a correspondence; a large proportion of the real-time image's feature points may have no corresponding points in a given library image.
  • The purpose of this step is to establish, by similarity calculation, the correspondence between the feature points of the real-time image and the feature points of each library image. Specifically:
  • Step S321: First, neighborhood blocks of the same size are determined centered on each feature point of the real-time image and of each library image.
  • Step S322: Calculate the similarity values between the neighborhood blocks from a library image and from the real-time image, and take the library image feature point with the best similarity value as the corresponding feature point of the real-time image's feature point. Specifically, a feature point is first selected on the real-time image and a library image is selected; the similarity value between the neighborhood block of each feature point of that library image and the neighborhood block of the real-time image's feature point is calculated, and the library image feature point with the best similarity value is taken as the corresponding feature point. Another library image is then selected and the above steps are repeated until the corresponding feature points on all library images have been obtained.
  • Another approach is to calculate the similarity between the two blocks as the sum of the correlation coefficients of the pixels of the neighborhood blocks of the library image and the real-time image:
  • E4 represents the sum of the correlation coefficients of the pixels of the neighborhood blocks of the library image and the real-time image;
  • Il and Ir respectively represent the gray values of the pixels in the neighborhood blocks of the library image and the real-time image.
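The exact expression for E4 is not reproduced in the text above, so the sketch below uses a standard zero-mean normalized correlation coefficient as one plausible form of the metric, with `block_lib` and `block_rt` standing for the Il and Ir neighborhood blocks:

```python
import numpy as np

def correlation_similarity(block_lib, block_rt):
    """Correlation-coefficient similarity between a library-image
    neighborhood block (Il) and a real-time-image block (Ir).
    Returns a value in [-1, 1]; values near 1 indicate similar blocks.
    (Sketch: the patent's exact E4 formula is not given here, so the
    standard zero-mean normalized cross-correlation is used instead.)"""
    il = block_lib.astype(float) - block_lib.mean()
    ir = block_rt.astype(float) - block_rt.mean()
    denom = np.sqrt((il ** 2).sum() * (ir ** 2).sum())
    if denom == 0:  # a flat block has no texture; correlation is undefined
        return 0.0
    return float((il * ir).sum() / denom)
```

With a correlation-based metric, "optimal" means the largest value, whereas with a pixel-difference metric it means the smallest; this sign convention matters when the result is later compared against a matching standard value.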
  • Step S323: This is an optional, preferred step.
  • The optimal similarity value obtained in step S322 is compared with a predetermined second matching standard value. If it falls outside the range defined by the matching standard value, it is considered that the library image in fact contains no feature point corresponding to that feature point of the real-time image.
  • After step S32, it can be determined whether each feature point of the real-time image has a corresponding feature point in each library image; in particular, the corresponding feature point pairs and their similarities can be determined.
  • Step S33: The optimal matching library image is selected according to the number of corresponding feature points of each library image, or according to the sum of the similarity values of the feature point pairs between each library image and the real-time image.
  • One screening method is to select the library image with the largest number of corresponding feature points as the matching image.
  • An empirical threshold can be set: if the number of feature points matching the real-time image is less than the threshold, the match is considered unsuccessful, and the current real-time image has no corresponding slice in the image library.
  • Another screening method is to select the library image with the optimal sum of similarity values of the feature point pairs between it and the current real-time image as the matching image; an empirical threshold can likewise be set for judging whether the match is successful.
  • When a feature point of the real-time image has no corresponding feature point, a fixed similarity value is assigned.
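The count-based screening of step S33 can be sketched in a few lines; the threshold of 10 matches is an illustrative empirical value, not one stated in the patent:

```python
def select_matching_library_image(match_counts, min_matches=10):
    """Pick the library image with the most corresponding feature points.
    match_counts maps a library-image id to the number of feature-point
    pairs it shares with the real-time image. If even the best count falls
    below the empirical threshold, the match is deemed unsuccessful and
    None is returned: the current real-time image has no corresponding
    slice in the image library."""
    if not match_counts:
        return None
    best = max(match_counts, key=match_counts.get)
    return best if match_counts[best] >= min_matches else None
```

The similarity-sum variant is identical in shape: replace the counts with per-image sums of pair similarities (using the fixed fallback value for unmatched points) and compare the best sum against its own empirical threshold.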
  • The above embodiments 1-3 all search the entire image library.
  • The number of library images in the image library is large, covering ultrasound images of various specific tissues or organs at various orientations and imaging angles.
  • The ultrasound-assisted scanning method of the present invention can therefore receive user input before scanning starts to determine the name of the organ or tissue of the tested body to be scanned, so that only the library images belonging to that organ are called. This significantly reduces the calculation range of image matching without affecting its accuracy.
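The organ-name pre-filtering amounts to a simple restriction of the search set. In the sketch below the `"organ"` key and record layout are illustrative assumptions about how library images might be annotated, not details from the patent:

```python
def library_images_for_organ(image_library, organ_name):
    """Restrict matching to the library images annotated with the organ or
    tissue name the user entered before scanning. Each library entry is
    assumed (for illustration) to be a dict carrying an 'organ' field."""
    return [entry for entry in image_library if entry.get("organ") == organ_name]
```

Matching then proceeds exactly as before, but only over the returned subset, which is why accuracy is unaffected while the amount of computation drops.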
  • Accordingly, the present invention also provides an ultrasound-assisted scanning system that gives real-time feedback on the correctness of the user's operation through real-time image matching, and guides the user to improve scanning technique by providing detailed graphic information.
  • The ultrasound-assisted scanning system includes an imaging subsystem, a scan-assist subsystem, and a display subsystem.
  • The imaging subsystem includes a probe 11 and an imaging module 12. The probe 11 is in direct contact with the body to be tested; at a given probe position it emits ultrasonic waves into the body and receives the reflected echo signals. The imaging module 12 performs signal processing on the echo signals to obtain a real-time image at the current probe position.
  • The scan-assist subsystem includes an image library 13 and an image matching module 14. The image library 13 stores a plurality of pre-established library images, and the image matching module 14 performs similarity matching between the real-time image generated by the imaging module 12 and the library images in the image library 13, so that the correctness of the ultrasonic scanning operation at the current probe position can be judged and fed back instantly.
  • The display subsystem includes a display 15 and an output configuration module 16. The output configuration module 16 is communicatively coupled to the image matching module 14 and the image library 13, so that after the image matching module 14 obtains a definite matching result, the display 15 outputs the graphic information corresponding to that result, such as, but not limited to, the current real-time image, a graphic explanation of how to adjust the probe position, and the graphic information corresponding to the standard image.
  • In one embodiment, the image matching module 14 of the above ultrasound-assisted scanning system includes a library image feature point acquiring unit 141, a real-time image matching point determining unit 142, and an optimal matching library image screening unit 143.
  • The library image feature point acquiring unit 141 retrieves the feature points of each of the plurality of predetermined library images.
  • The library image feature points described here are predetermined in order to improve the computational speed of the actual scan, but this does not exclude implementing the present invention by determining the library image feature points in real time.
  • The real-time image matching point determining unit 142 determines, by similarity calculation, the matching point in the real-time image corresponding to each feature point of each library image.
  • The invention adopts an image-block or feature-point based pattern matching method to realize the similarity matching calculation.
  • The optimal matching library image screening unit 143 selects, according to the matching degree between all feature points of each library image and their corresponding matching points, the library image with the highest similarity among the plurality of library images as the optimal matching library image of the current real-time image.
  • Specifically, the real-time image matching point determining unit 142 first determines, for each feature point provided by the library image feature point acquiring unit 141, a corresponding search range on the real-time image, for example a neighborhood of the corresponding coordinate point.
  • Then a template of a certain size is determined, centered on each feature point of the library image, and neighborhood blocks of the same size as the template are determined, centered on the pixels within the search range.
  • The real-time image matching point determining unit 142 calculates the similarity value between each template of the library image and all neighborhood blocks in the real-time image, which can be done by the pixel-difference or correlation-coefficient method described above, and takes the center of the neighborhood block with the optimal template similarity value as the matching point corresponding to the feature point.
  • In either implementation, the similarity calculation yields a similarity value between each template of the library image and each neighborhood block of the real-time image, and the real-time image matching point determining unit 142 accordingly takes the center of the neighborhood block with the optimal similarity value as the matching point corresponding to the feature point.
  • Preferably, the real-time image matching point determining unit 142 also presets a matching standard value. After the neighborhood block most similar to a template is found, their similarity value is compared with the matching standard value; only when the optimal similarity value satisfies the range defined by the matching standard value is the current neighborhood block considered to truly correspond to the template, its center then being the matching point of the template's feature point. Different similarity calculation methods have different metrics: with the pixel-difference method the optimal similarity value should not exceed the matching standard value, whereas with the correlation-coefficient method it should not be less than the matching standard value.
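The template search within a coordinate neighborhood, together with the matching-standard-value check, might look like this sketch (block sizes, search radius, and the 0.7 standard value are illustrative; the correlation-coefficient metric is used, so "optimal" means largest):

```python
import numpy as np

def find_matching_point(lib_img, rt_img, feat_yx, tmpl_r=4, search_r=8,
                        standard_value=0.7):
    """For one library-image feature point, scan a neighborhood of the same
    coordinates in the real-time image for the block most similar to the
    template around the feature point. The best match is accepted only if
    its similarity is not less than the preset matching standard value
    (for a pixel-difference metric the comparison would be reversed).
    Returns the matching point (y, x), or None if no block qualifies."""
    y0, x0 = feat_yx
    tmpl = lib_img[y0 - tmpl_r:y0 + tmpl_r + 1,
                   x0 - tmpl_r:x0 + tmpl_r + 1].astype(float)
    tmpl = tmpl - tmpl.mean()
    best_val, best_pt = -np.inf, None
    for y in range(y0 - search_r, y0 + search_r + 1):
        for x in range(x0 - search_r, x0 + search_r + 1):
            if y - tmpl_r < 0 or x - tmpl_r < 0:
                continue  # block would run off the top/left image border
            block = rt_img[y - tmpl_r:y + tmpl_r + 1,
                           x - tmpl_r:x + tmpl_r + 1].astype(float)
            if block.shape != tmpl.shape:
                continue  # block runs off the bottom/right border
            block = block - block.mean()
            denom = np.sqrt((tmpl ** 2).sum() * (block ** 2).sum())
            val = (tmpl * block).sum() / denom if denom else 0.0
            if val > best_val:
                best_val, best_pt = val, (y, x)
    return best_pt if best_val >= standard_value else None
```

Restricting the scan to a neighborhood of the feature point's own coordinates is what keeps this per-point search cheap relative to scanning the whole real-time image.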
  • The optimal matching library image screening unit 143 calculates the sum of the similarity values between all feature points of each library image and their corresponding matching points, thereby determining the total similarity value between that library image and the real-time image. It should be noted that not every feature point has a matching point in the real-time image, so a fixed similarity value is assigned to a feature point that lacks a corresponding matching point. The library image with the optimal total similarity value among all library images is the optimal matching library image of the current real-time image.
  • In another embodiment, the optimal matching library image screening unit 143 performs the screening based on the relative positional relationship between feature points. First, it calculates the angle between each feature point of each library image and its adjacent feature points, recorded as the angle of the feature point; it then calculates the angle between that feature point's matching point on the real-time image and the matching points of the adjacent feature points, recorded as the angle of the matching point. As before, when a feature point has no matching point, a preset fixed angle is used as the angle between its matching point on the real-time image and the matching points of the adjacent feature points.
  • The optimal matching library image screening unit 143 then calculates, for each library image, the sum of the differences between the angles of all its feature points and those of the corresponding matching points, and determines the library image with the minimum angle difference; that library image is the optimal matching library image of the current real-time image.
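The angle-based screening can be sketched as follows. The patent does not fix which points count as "adjacent", so this sketch takes each point's successor in the list as its adjacent point, and the fixed penalty for missing matches is an illustrative value:

```python
import numpy as np

def angle_of(p, q):
    """Angle (radians) of the vector from feature point p to adjacent point q."""
    return np.arctan2(q[0] - p[0], q[1] - p[1])

def total_angle_difference(lib_points, rt_points, missing_penalty=np.pi):
    """Sum over feature points of the difference between the library-image
    angle (point to its adjacent point) and the corresponding matching-point
    angle in the real-time image. A feature point whose matching point is
    missing (None) contributes a preset fixed penalty instead."""
    total = 0.0
    for i in range(len(lib_points) - 1):
        m0, m1 = rt_points[i], rt_points[i + 1]
        if m0 is None or m1 is None:
            total += missing_penalty
            continue
        a_lib = angle_of(lib_points[i], lib_points[i + 1])
        a_rt = angle_of(m0, m1)
        d = abs(a_lib - a_rt)
        total += min(d, 2 * np.pi - d)  # wrap the difference into [0, pi]
    return total

def best_library_by_angles(candidates, rt_matches):
    """candidates: {lib_id: feature-point list}; rt_matches: {lib_id: matching
    point list, None where unmatched}. Returns the library image whose
    relative point geometry best agrees with the real-time image."""
    return min(candidates,
               key=lambda k: total_angle_difference(candidates[k], rt_matches[k]))
```

Because angles between point pairs are invariant to translation, this screening tolerates a shifted probe position while still penalizing rotated or distorted point layouts.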
  • Preferably, the optimal matching library image screening unit 143 also presets a slice matching threshold. After the minimum angle difference or the optimal total similarity value is obtained, it is compared with the slice matching threshold; only when the minimum angle difference or the optimal similarity value satisfies the range defined by the slice matching threshold is the corresponding library image selected as the optimal matching library image.
  • In another embodiment, the image matching module 14 of the ultrasound-assisted scanning system includes a feature point acquiring unit 141', a feature point correspondence establishing unit 142', and an optimal matching library image screening unit 143'.
  • The feature point acquiring unit 141' obtains the feature points of each library image and of the real-time image respectively; it is important that the feature points of both are determined by the same method.
  • The feature point correspondence establishing unit 142' establishes, by similarity calculation, a correspondence between the feature points of the real-time image and those of each library image, thereby determining whether each feature point of the real-time image has a corresponding feature point in each library image; two feature points with such a correspondence are recorded as a feature point pair.
  • The optimal matching library image screening unit 143' selects the optimal matching library image according to the number of corresponding feature points of each library image, or according to the sum of the similarity values of the feature point pairs between each library image and the real-time image.
  • Specifically, the feature point correspondence establishing unit 142' first determines neighborhood blocks of the same size, centered on each feature point of the real-time image and of each library image, and then calculates the similarity values between each neighborhood block of the real-time image and all neighborhood blocks of each library image, obtaining, in each library image, the neighborhood block most similar to a given neighborhood block of the real-time image.
  • The centers of the two most similar neighborhood blocks in the real-time image and the library image form a feature point pair, the feature point on the library image being called the corresponding feature point.
  • Preferably, the feature point correspondence establishing unit 142' compares the calculated optimal similarity value with a predetermined matching standard value; only if the optimal similarity value satisfies the range defined by the matching standard value is the center of the neighborhood block of the library image taken as the corresponding feature point, on the library image, of the feature point of the real-time image.
  • The output configuration module 16 configures the display 15 to output different graphic help information based on the specific matching results of the image matching module 14.
  • The image library 13 pre-stores the corresponding data (such as images, probe marks, and text guidance information) required to assist the user in performing an efficient scan.
  • The output configuration module 16 recalls the corresponding data from the image library 13 based on the received matching result and displays it on the display 15 in real time, forming good interaction with the user.
  • The specific matching results and graphic help information have been detailed above and are not repeated here.
  • Although the ultrasound-assisted scanning system described above includes an image library, an image matching module, and an output configuration module, these components need not be integrated into the ultrasonic diagnostic apparatus; they may instead act as a plug-in of the ultrasonic diagnostic apparatus, connected to the instrument when the user requires it to provide auxiliary scanning, thereby forming the described ultrasound-assisted scanning system.
  • The detailed development of the ultrasound-assisted scanning method and system above reveals the significant advantages of the present invention over existing teaching systems: 1. the real-time feedback mechanism lets the user know whether the current scanning operation meets clinical medical requirements; 2. the automatic call-out mechanism for standard images saves the user from manually selecting the required standard image, making the system more user-friendly overall; 3. the probe adjustment prompt mechanism lets the user, especially a beginner, know how to properly adjust the probe position, improving learning efficiency.

Abstract

The present invention relates to an ultrasound-assisted scanning method and system. Building on the generation of a real-time ultrasound image, the method of the present invention performs matching operations between the acquired real-time image and the library images in an image library, so that the correctness of the current scanning operation is immediately fed back to the user. The method can further output image/text help information according to the matching result. If the current real-time image is the standard section image of a tissue or organ, the relevant image/text information for that standard section is automatically output, thereby improving ease of operation. If the standard section image has not been obtained, the method prompts the user on how to adjust the probe position, so that the desired image can be obtained quickly and accurately. The method and system of the present invention improve user interaction and user guidance, and can considerably improve the effect of assisted scanning.
PCT/CN2014/077325 2013-11-28 2014-05-13 Procédé et système de balayage assisté par ultrasons WO2015078148A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310628549.0 2013-11-28
CN201310628549.0A CN104680481B (zh) 2013-11-28 2013-11-28 一种超声辅助扫查方法和系统

Publications (1)

Publication Number Publication Date
WO2015078148A1 true WO2015078148A1 (fr) 2015-06-04

Family

ID=53198286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/077325 WO2015078148A1 (fr) 2013-11-28 2014-05-13 Procédé et système de balayage assisté par ultrasons

Country Status (2)

Country Link
CN (1) CN104680481B (fr)
WO (1) WO2015078148A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111544035A (zh) * 2020-05-12 2020-08-18 上海深至信息科技有限公司 一种基于增强现实的超声辅助扫查系统及方法
CN112529083A (zh) * 2020-12-15 2021-03-19 深圳开立生物医疗科技股份有限公司 一种超声扫查方法、装置、设备及存储介质

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105496454B (zh) * 2015-11-25 2019-01-29 飞依诺科技(苏州)有限公司 用于超声成像设备的数据自动比对方法及系统
WO2017158998A1 (fr) * 2016-03-14 2017-09-21 富士フイルム株式会社 Dispositif de diagnostic par ultrasons et procédé de commande d'un dispositif de diagnostic par ultrasons
CN106580368B (zh) * 2016-11-26 2020-06-23 汕头市超声仪器研究所有限公司 一种用于超声波影像设备的辅助扫查方法
CN106388865A (zh) * 2016-11-26 2017-02-15 汕头市超声仪器研究所有限公司 一种引导人工采集超声波切面图像的方法
CN106510759A (zh) * 2016-11-26 2017-03-22 汕头市超声仪器研究所有限公司 一种半自动超声波诊断方法
CN110087551A (zh) * 2017-04-27 2019-08-02 深圳迈瑞生物医疗电子股份有限公司 一种胎心超声检测方法及超声成像系统
CN107374674A (zh) * 2017-08-28 2017-11-24 深圳开立生物医疗科技股份有限公司 一种超声探头扫查控制方法及装置
CN107569257A (zh) * 2017-09-29 2018-01-12 深圳开立生物医疗科技股份有限公司 超声图像处理方法及系统、超声诊断设备
CN107679574A (zh) * 2017-09-29 2018-02-09 深圳开立生物医疗科技股份有限公司 超声图像处理方法及系统
CN108804547A (zh) * 2018-05-18 2018-11-13 深圳华声医疗技术股份有限公司 超声图像教学方法、装置及计算机可读存储介质
CN109044398B (zh) * 2018-06-07 2021-10-19 深圳华声医疗技术股份有限公司 超声系统成像方法、装置及计算机可读存储介质
CN109276274A (zh) * 2018-10-26 2019-01-29 深圳开立生物医疗科技股份有限公司 一种超声图像标准切面识别及测量方法和超声诊断设备
CN109276275A (zh) * 2018-10-26 2019-01-29 深圳开立生物医疗科技股份有限公司 一种超声图像标准切面提取及测量方法和超声诊断设备
CN109711441B (zh) * 2018-12-13 2021-02-12 泰康保险集团股份有限公司 图像分类方法、装置、存储介质及电子设备
CN109589141A (zh) * 2018-12-28 2019-04-09 深圳开立生物医疗科技股份有限公司 一种超声诊断辅助方法、系统和超声诊断设备
CN117618021A (zh) * 2018-12-29 2024-03-01 深圳迈瑞生物医疗电子股份有限公司 超声成像系统及相关的工作流系统和方法
CN109925002A (zh) * 2019-01-15 2019-06-25 胡秋明 人工智能超声心动图数据采集系统及其数据采集方法
CN109674494B (zh) * 2019-01-29 2021-09-14 深圳瀚维智能医疗科技有限公司 超声扫查实时控制方法、装置、存储介质及计算机设备
CN110269641B (zh) * 2019-06-21 2022-09-30 深圳开立生物医疗科技股份有限公司 一种超声成像辅助引导方法、系统、设备及存储介质
CN110742654B (zh) * 2019-11-05 2020-11-17 深圳度影医疗科技有限公司 一种基于三维超声图像的标准切面的定位和测量方法
CN110974294A (zh) * 2019-12-19 2020-04-10 上海尽星生物科技有限责任公司 超声扫描方法及装置
CN113116378A (zh) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 多模态融合成像方法、超声设备及存储介质
CN110960262B (zh) * 2019-12-31 2022-06-24 上海杏脉信息科技有限公司 超声扫查系统、方法及介质
CN113129342A (zh) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 多模态融合成像方法、装置及存储介质
CN113116386B (zh) * 2019-12-31 2023-08-04 无锡祥生医疗科技股份有限公司 超声成像引导方法、超声设备及存储介质
CN111110274A (zh) * 2020-02-05 2020-05-08 姜通渊 一种回收b超扫描仪及b超检测方法
CN111449684B (zh) * 2020-04-09 2023-05-05 济南康硕生物技术有限公司 心脏超声标准扫查切面快速获取方法及系统
CN113693625B (zh) * 2021-09-03 2022-11-08 深圳迈瑞软件技术有限公司 超声成像方法和超声成像设备
CN113951923A (zh) * 2021-10-26 2022-01-21 深圳迈瑞动物医疗科技有限公司 兽用超声成像设备、超声成像设备及其扫查方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101455576A (zh) * 2007-12-12 2009-06-17 深圳迈瑞生物医疗电子股份有限公司 超声宽景成像的方法、装置与系统
CN101474082A (zh) * 2009-01-16 2009-07-08 北京工业大学 基于有限元形变理论的血管壁弹性分析方法
WO2012124341A1 (fr) * 2011-03-16 2012-09-20 富士フイルム株式会社 Procédé et dispositif de génération d'images photo-acoustiques



Also Published As

Publication number Publication date
CN104680481B (zh) 2018-09-11
CN104680481A (zh) 2015-06-03

Similar Documents

Publication Publication Date Title
WO2015078148A1 (fr) Procédé et système de balayage assisté par ultrasons
CN110087550B (zh) 一种超声图像显示方法、设备及存储介质
JP6228969B2 (ja) 医療用診断装置およびその計測方法
JP6629094B2 (ja) 超音波診断装置、医用画像処理装置及び医用画像処理プログラム
WO2018129737A1 (fr) Procédé de mesure de paramètres dans une image ultrasonore et système d'imagerie ultrasonore
US20060034513A1 (en) View assistance in three-dimensional ultrasound imaging
CN110087555B (zh) 一种超声设备及其三维超声图像的显示变换方法、系统
BR112015025074B1 (pt) Sistema de imageamento por ultrassom e método para gerar e avaliar vistas bidimensionais padrão a partir de dados de volume ultrassônico tridimensional
US11931201B2 (en) Device and method for obtaining anatomical measurements from an ultrasound image
JP7358457B2 (ja) 超音波画像による脂肪層の識別
CN111281430B (zh) 超声成像方法、设备及可读存储介质
JP7442548B2 (ja) 誘導式超音波撮像
JP7075854B2 (ja) 超音波診断装置及び表示方法
CN110956076B (zh) 基于容积渲染在三维超声数据中进行结构识别的方法和系统
US20170119354A1 (en) Imaging systems and methods for positioning a 3d ultrasound volume in a desired orientation
CN111035408A (zh) 用于超声探头定位反馈的增强的可视化的方法和系统
JP5113548B2 (ja) 超音波画像処理装置
WO2020215485A1 (fr) Procédé, système et dispositif à ultrasons de mesure de paramètre de croissance fœtale
WO2022099705A1 (fr) Procédé d'échographie de fœtus en début de grossesse et système d'échographie
JP2018011635A (ja) 画像処理装置および画像処理方法
WO2015087191A1 (fr) Séquencement d'analyse personnalisée pour une imagerie ultrasonore volumétrique en temps réel
WO2022099704A1 (fr) Procédé d'échographie et système d'échographie de fœtus en milieu et fin de grossesse
JP7299100B2 (ja) 超音波診断装置及び超音波画像処理方法
US20230267618A1 (en) Systems and methods for automated ultrasound examination
JP7169153B2 (ja) 超音波診断装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14865797

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/11/2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14865797

Country of ref document: EP

Kind code of ref document: A1