CN111932536A - Method and device for verifying lesion marking, computer equipment and storage medium


Info

Publication number
CN111932536A
Authority
CN
China
Prior art keywords
curves
curve
labeling
lesion
marking
Prior art date
Legal status
Granted
Application number
CN202011053234.4A
Other languages
Chinese (zh)
Other versions
CN111932536B (en)
Inventor
郑秋芳
冯豆豆
李海同
Current Assignee
Shenzhen Ping An Smart Healthcare Technology Co ltd
Original Assignee
Ping An International Smart City Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An International Smart City Technology Co Ltd filed Critical Ping An International Smart City Technology Co Ltd
Priority to CN202011053234.4A priority Critical patent/CN111932536B/en
Publication of CN111932536A publication Critical patent/CN111932536A/en
Application granted granted Critical
Publication of CN111932536B publication Critical patent/CN111932536B/en
Priority to PCT/CN2021/096198 priority patent/WO2022068228A1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0012 - Biomedical image inspection
    • G06T 7/0014 - Biomedical image inspection using an image reference approach

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a method and device for verifying lesion markings, computer equipment and a storage medium. The method comprises: acquiring two lesion markings of one image in medical image data and judging whether their type information is the same; if it is the same, judging whether the labeling curves of the two lesion markings are non-closed curves; if both labeling curves are non-closed curves, judging whether they are similar based on a curve judgment rule using a dynamic time warping algorithm to obtain a verification result; if both labeling curves are closed curves, judging whether their closed regions coincide according to a coincidence-degree judgment rule to obtain a verification result; and if the two labeling curves are a closed curve and a non-closed curve respectively, judging whether the pixels corresponding to the two labeling curves overlap according to a ratio threshold to obtain a verification result. The invention is based on intelligent decision technology and belongs to the field of artificial intelligence; it verifies the consistency of lesion markings against a unified standard and can greatly improve the efficiency and quality of lesion marking verification.

Description

Method and device for verifying lesion marking, computer equipment and storage medium
Technical Field
The invention relates to the technical field of intelligent decision-making and belongs to the application scenario of lesion marking verification in smart cities, and in particular relates to a method and device for verifying lesion markings, computer equipment and a storage medium.
Background
With the progress of medical technology and the popularization of the Internet, remote diagnosis is increasingly used as a new diagnostic method. In remote diagnosis, the medical image data collected by detection equipment is usually sent to several doctors, and every doctor adds lesion markings to the same set of medical image data; whether the lesion markings added by each doctor are consistent is then checked manually. Manual checking is influenced by subjective judgment, so it is difficult to maintain a unified checking standard, and it consumes a large amount of human resources, which is unfavorable to improving checking efficiency. As a result, both the efficiency and the quality of the consistency check on lesion markings are affected. The prior-art methods therefore suffer from low checking efficiency and quality when performing consistency checks on the lesion markings added to medical image data.
Disclosure of Invention
The embodiments of the invention provide a method, a device, computer equipment and a storage medium for verifying lesion markings, aiming to solve the problem that prior-art methods have low checking efficiency and quality when performing consistency checks on the lesion markings added to medical image data.
In a first aspect, an embodiment of the present invention provides a method for verifying a lesion marking, including:
receiving, from a user terminal, medical image data to which lesion markings have been added, and acquiring two lesion markings of any one image in the medical image data as labeling information to be verified;
judging whether the type information of the two lesion markings in the labeling information to be verified is the same;
if the type information of the two lesion markings is the same, judging whether the labeling curves in the two lesion markings are closed curves to obtain a curve type judgment result;
if the labeling curves in the two lesion markings are both non-closed curves, judging whether the two labeling curves are similar according to a preset curve judgment rule to obtain a verification result of whether the labeling information to be verified is consistent, wherein the curve judgment rule is a judgment rule based on a dynamic time warping algorithm;
if the labeling curves in the two lesion markings are both closed curves, judging whether the closed regions corresponding to the two labeling curves coincide according to a preset coincidence-degree judgment rule to obtain a verification result of whether the labeling information to be verified is consistent;
if the labeling curves in the two lesion markings are a closed curve and a non-closed curve respectively, judging whether the pixels corresponding to the two labeling curves overlap according to a preset ratio threshold to obtain a verification result of whether the labeling information to be verified is consistent.
In a second aspect, an embodiment of the present invention provides a device for verifying a lesion marking, including:
a to-be-verified labeling information acquisition unit, used for receiving, from a user terminal, medical image data to which lesion markings have been added, and acquiring two lesion markings of any one image in the medical image data as the labeling information to be verified;
the type information judging unit is used for judging whether the type information of the two lesion marks in the marking information to be verified is the same;
the curve type judging unit is used for judging whether the marked curves in the two lesion marks are closed curves or not to obtain a curve type judging result if the type information of the two lesion marks is the same;
the first verification unit is used for judging whether the two labeling curves are similar according to a preset curve judgment rule to obtain a verification result whether the labeling information to be verified is consistent or not if the labeling curves in the two lesion labels are both non-closed curves, wherein the curve judgment rule is a judgment rule based on a dynamic time warping algorithm;
a second verification unit, used for judging, if the labeling curves in the two lesion markings are both closed curves, whether the closed regions corresponding to the two labeling curves coincide according to a preset coincidence-degree judgment rule, so as to obtain a verification result of whether the labeling information to be verified is consistent;
and a third verification unit, used for judging, if the labeling curves in the two lesion markings are a closed curve and a non-closed curve respectively, whether the pixels corresponding to the two labeling curves overlap according to a preset ratio threshold, so as to obtain a verification result of whether the labeling information to be verified is consistent.
In a third aspect, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the processor implements the method for verifying a lesion marking according to the first aspect.
In a fourth aspect, the present invention further provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program, when executed by a processor, causes the processor to execute the method for lesion marking verification according to the first aspect.
The embodiments of the invention provide a method and device for verifying lesion markings, computer equipment and a storage medium. Two lesion markings of any image in medical image data are acquired as labeling information to be verified, and whether the type information of the two lesion markings is the same is judged; if it is the same, whether the labeling curves in the two lesion markings are non-closed curves is judged; if both labeling curves are non-closed curves, whether they are similar is judged according to a curve judgment rule based on a dynamic time warping algorithm to obtain a verification result of whether the labeling information is consistent; if both labeling curves are closed curves, whether their closed regions coincide is judged according to a coincidence-degree judgment rule to obtain the verification result; and if the two labeling curves are a closed curve and a non-closed curve respectively, whether the pixels corresponding to the two labeling curves overlap is judged according to a ratio threshold to obtain the verification result. With this method, the consistency of lesion markings is verified against a unified standard, and the efficiency and quality of lesion marking verification can be greatly improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for verifying a lesion marking according to an embodiment of the present invention;
fig. 2 is a schematic view of an application scenario of a method for verifying a lesion marking according to an embodiment of the present invention;
fig. 3 is another schematic flow chart of a method for verifying a lesion marking according to an embodiment of the present invention;
fig. 4 is a sub-flow diagram of a method for verifying a lesion marking according to an embodiment of the present invention;
FIG. 5 is a schematic view of another sub-process of a method for verifying a lesion marking according to an embodiment of the present invention;
FIG. 6 is a schematic view of another sub-process of a method for verifying a lesion marking according to an embodiment of the present invention;
FIG. 7 is a schematic view of another sub-process of a method for verifying a lesion marking according to an embodiment of the present invention;
FIG. 8 is a schematic view of another sub-process of a method for verifying a lesion marking according to an embodiment of the present invention;
fig. 9 is a schematic block diagram of a lesion marking verification apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic flow chart of a method for verifying a lesion marking according to an embodiment of the present invention, and fig. 2 is a schematic application scenario diagram of the method for verifying a lesion marking according to an embodiment of the present invention; the method for verifying the lesion marking is applied to a management server 10, the method is executed through application software installed in the management server 10, a plurality of user terminals 20 are connected with the management server 10 through a network to realize data information transmission, the management server 10 is a server end used for verifying whether two lesion markings of the same image are consistent, the user terminals 20 are terminal equipment for a user to add the lesion markings to medical image data from the management server, such as a desktop computer, a notebook computer, a tablet computer or a mobile phone, and the user can be a doctor. As shown in FIG. 1, the method includes steps S110 to S160.
S110, receiving, from the user terminal, medical image data to which lesion markings have been added, and acquiring two lesion markings of any one image in the medical image data as the labeling information to be verified.
Medical image data to which lesion markings have been added is received from the user terminal, and two lesion markings of any one image in that data are acquired as the labeling information to be verified. The medical image data is image information acquired by detection equipment and comprises at least one image. During diagnosis, a user (for example, a doctor) receives the medical image data and adds lesion markings to each image it contains in turn, obtaining medical image data with lesion markings added; each image in the medical image data may carry several lesion markings. Specifically, two lesion markings of any image contained in the medical image data can be acquired in turn as labeling information to be verified, and a consistency check is performed on that labeling information, that is, whether the two lesion markings it contains are consistent is checked.
Specifically, a lesion marking may include type information and a labeling curve. The type information identifies the specific lesion type marked in the image; one of several preset lesion types, for example hydrops, infection or bleeding, can be selected as the corresponding type information and added to the image. The labeling curve is added to the image to mark the lesion region; it may be a closed curve or a non-closed curve and may take any shape such as a circle, an ellipse, a polygon or a broken line. For example, a closed curve may be used to mark a block-shaped lesion region, and a non-closed curve may be used to mark a linear or strip-shaped lesion region.
In an embodiment, as shown in fig. 3, step S1101 is further included before step S110.
S1101, sending medical image data without lesion markings to a plurality of user terminals, so as to obtain the lesion markings added through the plurality of user terminals by the corresponding users and thereby obtain the medical image data with lesion markings added.
Specifically, after receiving the medical image data, a user terminal can display any one image of the medical image data on its display device, acquire the lesion type selected by the user from the several preset lesion types as the corresponding type information, and add that type information to the image; it can also collect, at a preset interval, the position points of the brush on the display device during the period in which the brush is in contact with the display device, and add the line connecting those position points to the image as the corresponding labeling curve.
When a labeling curve is acquired, the user can draw a curve on the display device with a brush while the display device shows the image. The display device collects the position point of the brush at a fixed interval (for example, every 0.1 second), obtaining the position points collected during the period in which the brush is in contact with the display device; each position point corresponds to an acquisition time, and the collected position points are connected in order to form the labeling curve.
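Purely as an illustration of the data involved (not part of the claimed method), a sampled position point and a lesion marking could be represented as below; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PositionPoint:
    x: int        # pixel column (abscissa) of the brush on the displayed image
    y: int        # pixel row (ordinate) of the brush on the displayed image
    t: float      # acquisition time in seconds (sampled at a fixed interval, e.g. 0.1 s)

@dataclass
class LesionMarking:
    type_info: str                                  # e.g. "hydrops", "infection", "bleeding"
    points: List[PositionPoint] = field(default_factory=list)

    def ordered_points(self) -> List[PositionPoint]:
        """Labeling curve as the polyline through the points in acquisition order."""
        return sorted(self.points, key=lambda p: p.t)
```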
And S120, judging whether the type information of the two lesion marks in the marking information to be verified is the same.
Whether the type information of the two lesion markings in the labeling information to be verified is the same is judged. When checking whether the two lesion markings contained in the labeling information to be verified are consistent, it must first be judged whether their type information is the same. If the type information of the two lesion markings is different, the verification result is that the labeling information to be verified is inconsistent; if it is the same, the labeling curves of the two lesion markings are checked further.
S130, if the type information of the two lesion marks is the same, judging whether the marked curves in the two lesion marks are closed curves to obtain a curve type judgment result.
If the type information of the two lesion markings is the same, whether the labeling curve in each of the two lesion markings is a closed curve is judged to obtain a curve type judgment result. After a labeling curve is judged, it can be determined whether it is a closed curve or a non-closed curve. Judging the curve types of the labeling curves of both lesion markings yields a curve type judgment result, which has three possible cases: both labeling curves are closed curves, both are non-closed curves, or one is a closed curve and the other is a non-closed curve. Alternatively, the field value recording the shape of the labeling curve in each lesion marking can be read directly to learn whether that labeling curve is a closed or a non-closed curve. For example, circular, elliptical or polygonal labeling curves are closed curves, and a broken-line labeling curve is a non-closed curve.
In one embodiment, as shown in fig. 4, step S130 includes sub-steps S131 and S132.
S131, determining a starting point and an end point of the labeling curve according to the acquisition time of the position point in the labeling curve.
The start point and the end point of the labeling curve are determined from the acquisition times of its position points. The labeling curve consists of several position points, each with an acquisition time: the position point with the earliest acquisition time is taken as the start point, and the position point with the latest acquisition time is taken as the end point.
S132, judging whether the distance difference between the starting point and the end point is smaller than a preset distance threshold value or not to obtain a judgment result whether the labeled curve is a closed curve or not.
Whether the distance between the start point and the end point is smaller than a preset distance threshold is judged to determine whether the labeling curve is a closed curve. If the distance between the start point and the end point of a labeling curve is smaller than the distance threshold, the start point and the end point are judged to coincide and the labeling curve is a closed curve; if the distance is not smaller than the distance threshold, the start point and the end point are judged not to coincide and the labeling curve is a non-closed curve. The preset distance threshold may be given as a number of pixels or as a length value.
For example, if the preset distance threshold is set to 5 pixels, if the distance difference between the starting point and the ending point of a certain labeled curve is less than 5 pixels, a judgment result that the labeled curve is a closed curve is obtained; otherwise, obtaining the judgment result that the labeled curve is a non-closed curve.
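A minimal sketch of sub-steps S131 and S132, assuming the position-point structure sketched earlier, a Euclidean pixel distance, and the 5-pixel threshold from the example:

```python
import math

def is_closed_curve(points, distance_threshold=5.0):
    """S131/S132 sketch: the start point is the position point with the earliest
    acquisition time, the end point the one with the latest; the labeling curve is
    treated as closed when the distance between them is below the threshold."""
    ordered = sorted(points, key=lambda p: p.t)
    start, end = ordered[0], ordered[-1]
    gap = math.hypot(end.x - start.x, end.y - start.y)   # Euclidean distance in pixels
    return gap < distance_threshold
```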
The curve type judgment is respectively carried out on the marking curves in the two lesion markings by adopting the method, and the final curve type judgment result can be obtained by integrating the judgment results of the two marking curves.
And S140, if the labeling curves in the two lesion labels are both non-closed curves, judging whether the two labeling curves are similar according to a preset curve judgment rule to obtain a verification result whether the labeling information to be verified is consistent, wherein the curve judgment rule is a judgment rule based on a dynamic time warping algorithm.
If the labeling curves in the two lesion markings are both non-closed curves, whether the two labeling curves are similar is judged according to a preset curve judgment rule to obtain a verification result of whether the labeling information to be verified is consistent; the curve judgment rule is a judgment rule based on a dynamic time warping algorithm. The curve judgment rule comprises a path-similarity threshold, a span threshold, an offset threshold, a dynamic time warping algorithm, a span acquisition rule, an offset-distance acquisition rule, a similarity calculation formula and a similarity threshold. Specifically, the span difference of the two labeling curves is first acquired according to the span acquisition rule; if the span difference is not greater than the span threshold, the offset distance of the two labeling curves is acquired according to the offset-distance acquisition rule; if the offset distance is greater than the offset threshold, the path similarity of the two labeling curves is acquired according to the dynamic time warping algorithm; if the path similarity is greater than the path-similarity threshold, the similarity of the two labeling curves is calculated according to the similarity calculation formula, the similarity being computed from the three dimensions of span difference, path similarity and offset distance; and if the similarity is greater than the similarity threshold, the verification result is that the labeling information to be verified is consistent. If the span difference is greater than the span threshold, or the offset distance is not greater than the offset threshold, or the path similarity is not greater than the path-similarity threshold, or the similarity is not greater than the similarity threshold, the verification result is that the labeling information to be verified is inconsistent.
In this scheme the Dynamic Time Warping (DTW) algorithm is used to obtain the path similarity of the two labeling curves. DTW is based on the idea of dynamic programming, and the path similarity of two curves with a high degree of path overlap can be calculated with it.
In an embodiment, as shown in fig. 5, step S140 includes sub-steps S141, S142, S143, and S144.
S141, acquiring a span difference of the two labeled curves according to the span acquisition rule, and judging whether the span difference is not greater than the span threshold value.
The span difference of the two labeling curves is acquired according to the span acquisition rule, and whether it is not greater than the span threshold is judged. Before obtaining the path similarity of the two labeling curves, the degree to which their paths coincide has to be assessed; whether the two labeling curves have a high degree of path coincidence can be judged from their span difference. The span difference of the two labeling curves comprises a transverse span difference and a longitudinal span difference, and the span threshold comprises a transverse span threshold and a longitudinal span threshold.
In an embodiment, as shown in fig. 6, step S141 includes sub-steps S1411 and S1412.
S1411, respectively acquiring the transverse span difference and the longitudinal span difference of the two labeling curves according to the span acquisition rule; S1412, judging whether the transverse span difference is not greater than the transverse span threshold and whether the longitudinal span difference is not greater than the longitudinal span threshold, so as to obtain a judgment result of whether the span difference is not greater than the span threshold.
The transverse span is the span of a labeling curve in the abscissa direction, and the longitudinal span is its span in the ordinate direction. The transverse spans of the two labeling curves are obtained, and the ratio of the smaller transverse span to the larger one is calculated to obtain the transverse span difference Xr; the longitudinal span difference Yr is obtained in the same way from the longitudinal spans. If the transverse span difference is not greater than the transverse span threshold and the longitudinal span difference is not greater than the longitudinal span threshold, the judgment result is that the span difference is not greater than the span threshold; otherwise, the judgment result is that the span difference is greater than the span threshold. Both Xr and Yr take values in (0, 1].
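A sketch of S1411 and S1412 under the definitions above, assuming non-degenerate curves (non-zero span in both coordinate directions); the resulting Xr and Yr are then compared against the transverse and longitudinal span thresholds as described:

```python
def span_difference(points_a, points_b):
    """Xr and Yr: ratio of the smaller to the larger span in the abscissa and
    ordinate directions; both lie in (0, 1], and 1 means identical spans."""
    def spans(points):
        xs = [p.x for p in points]
        ys = [p.y for p in points]
        return max(xs) - min(xs), max(ys) - min(ys)

    ax_span, ay_span = spans(points_a)
    bx_span, by_span = spans(points_b)
    x_r = min(ax_span, bx_span) / max(ax_span, bx_span)   # transverse span difference Xr
    y_r = min(ay_span, by_span) / max(ay_span, by_span)   # longitudinal span difference Yr
    return x_r, y_r
```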
And S142, if the span difference is not greater than the span threshold, acquiring the offset distance of the two marked curves according to the offset distance acquisition rule, and judging whether the offset distance is greater than the offset threshold.
If the span difference is not greater than the span threshold, the offset distance of the two labeling curves is acquired according to the offset-distance acquisition rule, and whether the offset distance is greater than the offset threshold is judged. When calculating the offset distance, it is first determined whether the two labeling curves contain equal numbers of position points; if not, the labeling curve with fewer points can be uniformly filled so that the two curves contain equal numbers of position points. Specifically, the labeling curve with more position points is denoted curve A with point count CA, and the one with fewer points is denoted curve B with point count CB; the point-count difference is CD = CA - CB. Between certain pairs of adjacent position points of curve B (for example, the n-th and the (n+1)-th position points), CP = ceil(CD/CB) points are filled in, where ceil denotes rounding up: one point is added every (L(n+1) - Ln)/CD along the segment, L(n+1) - Ln being the distance between the n-th and the (n+1)-th position points, so that CP points are filled between each such pair of points.
The offset-distance acquisition rule is based on the Fréchet distance. Let A and B denote the ordered sets of position points of curve A and curve B. An ordered set of position-point pairs can then be expressed by formula (4):
L = {(a1, b1), (a2, b2), …, (ak, bk)} (4);
where ai ∈ {1, 2, …, CA} and bi ∈ {1, 2, …, CB} index the position points of curve A and curve B (CB = CA after filling), and for any i, a(i+1) = ai or ai + 1 and b(i+1) = bi or bi + 1. The length ||L|| of such a set of position-point pairs of curve A and curve B is defined from the Euclidean distances of its position-point pairs, taking the maximum pairwise Euclidean distance, and the Fréchet distance F(A, B) of curve A and curve B is the minimum of ||L|| over all such sets. The offset distance is then
frd = F(A, B) / min(LA, LB);
where min(LA, LB) is the smaller of the accumulated Euclidean distances (lengths) of curve A and curve B, and frd takes values in (0, 1]. Whether the obtained offset distance is greater than the offset threshold is then judged.
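A sketch of the offset-distance computation under the reading above: the discrete Fréchet distance of the two point sequences, normalised by the shorter accumulated curve length. The point-filling step is omitted, points are plain (x, y) tuples, and the function names are illustrative.

```python
import math

def _euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def discrete_frechet(a, b):
    """Discrete Frechet distance of two point sequences a, b (lists of (x, y)):
    the minimum over admissible pairings of the maximum pairwise Euclidean distance."""
    n, m = len(a), len(b)
    ca = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = _euclid(a[i], b[j])
            if i == 0 and j == 0:
                ca[i][j] = d
            elif i == 0:
                ca[i][j] = max(ca[0][j - 1], d)
            elif j == 0:
                ca[i][j] = max(ca[i - 1][0], d)
            else:
                ca[i][j] = max(min(ca[i - 1][j], ca[i][j - 1], ca[i - 1][j - 1]), d)
    return ca[n - 1][m - 1]

def offset_distance(a, b):
    """frd = F(A, B) / min(LA, LB), reading LA and LB as the accumulated
    Euclidean (polyline) lengths of the two labeling curves."""
    def length(points):
        return sum(_euclid(p, q) for p, q in zip(points, points[1:]))
    return discrete_frechet(a, b) / min(length(a), length(b))
```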
In one embodiment, as shown in fig. 7, step S143 further includes steps S1431 and S1432.
S1431, acquiring a time sequence corresponding to each labeling curve according to the time axis of each labeling curve and judging whether the two time sequences are unified.
The time series corresponding to each labeling curve is acquired from its time axis, and whether the two time series are unified is judged. Each labeling curve consists of several position points, and each position point corresponds to an acquisition time; the position points can be represented by pixel coordinates. Because the speed at which the brush moves horizontally across the display device may vary while the sampling interval between consecutive position points is fixed, the projection distance of each pair of consecutive position points on the abscissa may vary as well. The projection distances on the abscissa of the consecutive position-point pairs of one labeling curve, taken together, form the time series of that labeling curve. To judge whether the two time series are unified, it is only necessary to judge whether the corresponding pairs of the two series have equal projection distances on the abscissa; if the two series do not contain equal numbers of position-point pairs, the number of pairs in the shorter series is used as the comparison reference.
For example, the information corresponding to one time series is shown in Table 1.
Consecutive position-point pair number:   1  2  3  4  5  6
Projection distance (pixels):             4  7  3  6  5  4
Table 1
And S1432, if the two time sequences are not unified, adjusting the two labeled curves to enable the time sequences of the two labeled curves to be unified.
If the two time series are not unified, the two labeling curves are adjusted so that their time series become unified. For example, if a corresponding pair of consecutive position points has a projection distance of 3 pixels in one series and 5 pixels in the other, both can be adjusted to 3 pixels, and the corresponding segment of the labeling curve is compressed accordingly; that is, the position points of the labeling curve are adjusted during the compression, and after the adjustment the time series of the two labeling curves are unified. If the two time series do not contain equal numbers of position-point pairs, the number of pairs in the shorter series is used as the alignment reference.
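A rough sketch of the time-series comparison of S1431, assuming the position-point structure sketched earlier; the compression adjustment of S1432 is not reproduced here.

```python
def projection_series(points):
    """Abscissa projection distances between consecutive position points, in sample order."""
    ordered = sorted(points, key=lambda p: p.t)
    return [abs(q.x - p.x) for p, q in zip(ordered, ordered[1:])]

def series_unified(points_a, points_b):
    """Two time series are unified when corresponding projection distances are equal,
    compared over the number of pairs in the shorter series."""
    sa, sb = projection_series(points_a), projection_series(points_b)
    k = min(len(sa), len(sb))
    return all(sa[i] == sb[i] for i in range(k))
```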
S143, if the offset distance is larger than the offset threshold, obtaining the path similarity of the two labeled curves according to the dynamic time warping algorithm, and judging whether the path similarity is larger than the path similarity threshold.
If the offset distance is greater than the offset threshold, obtaining the path similarity of the two labeled curves according to the dynamic time warping algorithm, and judging whether the path similarity is greater than the path similarity threshold. The dynamic time warping algorithm can obtain the shortest warping distance and the longest warping distance between the two labeled curves, calculate to obtain the path similarity according to a calculation formula in the dynamic time warping algorithm, and judge whether the path similarity is greater than a path similarity threshold value.
For example, let the two labeling curves be curve A and curve B; their shortest warping distance can be expressed by formula (1) and their longest warping distance by formula (2).
D(i,j)=Dist(i,j)+min{D(i-1,j),D(i,j-1), D(i-1,j-1)} (1);
D'(i,j)=Dist(i,j)+max{D'(i-1,j),D'(i,j-1), D'(i-1,j-1)} (2);
where i and j are positive integers greater than or equal to 1, Dist(i, j) is the path distance between the i-th position point of curve A and the j-th position point of curve B (the projection distance of the two position points on the ordinate), and D(i, j) denotes the shortest warping distance between the first i position points of curve A and the first j position points of curve B. Iterating formula (1) yields the shortest warping distance between the two curves; the calculation starts from i = 1 and j = 1, at which point min{D(i-1, j), D(i, j-1), D(i-1, j-1)} = 0 because the position points corresponding to i-1 or j-1 do not exist. The longest warping distance D'(i, j) is obtained from formula (2) in the same way.
The corresponding path similarity can be calculated from the obtained shortest and longest warping distances; the formula for calculating the path similarity can be expressed as formula (3).
Ds = 1 - D(i,j)/D'(i,j) (3);
where Ds is the calculated path similarity, whose value range is (0, 1).
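A direct transcription of formulas (1) to (3), taking Dist(i, j) as the ordinate projection distance between the i-th point of curve A and the j-th point of curve B; points are (x, y) tuples, and the boundary term is 0 when i-1 or j-1 does not exist, as noted above.

```python
def warping_distances(a, b):
    """Shortest and longest warping distances D(n, m) and D'(n, m), formulas (1)-(2)."""
    n, m = len(a), len(b)
    INF = float("inf")
    d_min = [[INF] * (m + 1) for _ in range(n + 1)]
    d_max = [[-INF] * (m + 1) for _ in range(n + 1)]
    d_min[0][0] = d_max[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dist = abs(a[i - 1][1] - b[j - 1][1])   # Dist(i, j): ordinate projection distance
            d_min[i][j] = dist + min(d_min[i - 1][j], d_min[i][j - 1], d_min[i - 1][j - 1])
            d_max[i][j] = dist + max(d_max[i - 1][j], d_max[i][j - 1], d_max[i - 1][j - 1])
    return d_min[n][m], d_max[n][m]

def path_similarity(a, b):
    """Ds = 1 - D(n, m) / D'(n, m), formula (3)."""
    shortest, longest = warping_distances(a, b)
    return 1.0 if longest == 0 else 1.0 - shortest / longest
```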
S144, if the path similarity is not smaller than the path-similarity threshold, calculating the similarity according to the similarity calculation formula, the span difference, the path similarity and the offset distance, and judging whether the similarity is greater than the similarity threshold so as to obtain the verification result of whether the labeling information to be verified is consistent.
If the path similarity is not smaller than the path-similarity threshold, the similarity is calculated according to the similarity calculation formula from the span difference, the path similarity and the offset distance, and whether it is greater than the similarity threshold is judged so as to obtain the verification result of whether the labeling information to be verified is consistent. Specifically, the similarity calculation formula can be expressed as formula (5):
S = s1×Xr + s2×Yr + s3×Ds + s4×(1 - frd) (5);
where s1, s2, s3 and s4 are preset parameter values in the similarity calculation formula; specifically, they can be set to satisfy s1 + s2 + s3 + s4 = 1, so that the resulting similarity S has a value range of (0, 1]. Whether the calculated similarity is greater than the similarity threshold is then judged: if it is, the two labeling curves are judged to be similar, that is, the verification result is that the labeling information to be verified is consistent; if it is not, the two labeling curves are judged to be dissimilar, that is, the verification result is that the labeling information to be verified is inconsistent. For example, the similarity threshold may be set to 0.8.
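A sketch of formula (5); the equal weights below are placeholders, since the text only requires s1 + s2 + s3 + s4 = 1.

```python
def overall_similarity(x_r, y_r, ds, frd, weights=(0.25, 0.25, 0.25, 0.25)):
    """Formula (5): S = s1*Xr + s2*Yr + s3*Ds + s4*(1 - frd), with s1+s2+s3+s4 = 1."""
    s1, s2, s3, s4 = weights
    return s1 * x_r + s2 * y_r + s3 * ds + s4 * (1.0 - frd)

# The two non-closed labeling curves are judged consistent when the similarity
# exceeds the similarity threshold, e.g. overall_similarity(x_r, y_r, ds, frd) > 0.8.
```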
S150, if the labeling curves in the two lesion markings are both closed curves, judging whether the closed regions corresponding to the two labeling curves coincide according to a preset coincidence-degree judgment rule to obtain a verification result of whether the labeling information to be verified is consistent.
If the labeling curves in the two lesion markings are both closed curves, whether the closed regions corresponding to the two labeling curves coincide is judged according to a preset coincidence-degree judgment rule to obtain a verification result of whether the labeling information to be verified is consistent. The coincidence-degree judgment rule comprises a coincidence-degree calculation formula and a coincidence-degree threshold. If both labeling curves are closed curves, the corresponding coincidence degree can be obtained according to the coincidence-degree judgment rule, and whether the closed regions corresponding to the two labeling curves coincide is judged based on that coincidence degree.
In one embodiment, as shown in fig. 8, step S150 includes sub-steps S151, S152, and S153.
And S151, pixel filling is carried out on the closed areas corresponding to the two labeling curves to obtain two corresponding pixel images.
Pixel filling is performed in the closed regions corresponding to the two labeling curves to obtain two corresponding pixel images. The closed regions corresponding to the two labeling curves are acquired and filled with pixels, yielding two pixel images of the same size (the same size as the images in the medical image data); filled pixel points in a pixel image take the value 1 and unfilled pixel points the value 0.
And S152, calculating to obtain corresponding coincidence degrees according to a coincidence degree calculation formula, overlapped pixels between the two pixel images and effective pixels corresponding to the two pixel images respectively.
The corresponding coincidence degree is calculated from the coincidence-degree calculation formula, the overlapping pixels of the two pixel images and the effective pixels of each pixel image. The two pixel images are superimposed and the values of corresponding pixel points are accumulated; a pixel point whose accumulated value is 2 is an overlapping pixel of the two pixel images, and the number of such pixel points is counted. A pixel point whose value is 1 in either pixel image is an effective pixel of that image. The coincidence-degree calculation formula can be expressed as formula (6):
Sx=2×S2/(SA+SB) (6);
where Sx is the obtained coincidence degree with value range (0, 1], S2 is the number of overlapping pixels of the two pixel images, SA is the number of effective pixels of pixel image A, and SB is the number of effective pixels of pixel image B.
S153, judging whether the coincidence degree is not less than the coincidence-degree threshold to determine whether the closed regions corresponding to the two labeling curves coincide, so as to obtain the verification result of whether the labeling information to be verified is consistent.
Whether the obtained coincidence degree is not less than the coincidence-degree threshold is judged. If it is not less than the threshold, the closed regions corresponding to the two labeling curves are judged to coincide, that is, the verification result is that the labeling information to be verified is consistent; otherwise they are judged not to coincide, that is, the verification result is that the labeling information to be verified is inconsistent. For example, the coincidence-degree threshold may be preset to 80%.
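A sketch of S151 to S153. The filling method is an assumption (the text does not prescribe one); here matplotlib's polygon test rasterises each closed labeling curve into a binary mask of the image size, and formula (6) is applied to the two masks.

```python
import numpy as np
from matplotlib.path import Path

def fill_closed_curve(points, image_shape):
    """Binary mask (filled pixels = True) of the region enclosed by a closed labeling curve."""
    h, w = image_shape
    yy, xx = np.mgrid[0:h, 0:w]
    polygon = Path([(p.x, p.y) for p in points])
    inside = polygon.contains_points(np.column_stack([xx.ravel(), yy.ravel()]))
    return inside.reshape(h, w)

def coincidence_degree(mask_a, mask_b):
    """Formula (6): Sx = 2 * S2 / (SA + SB)."""
    s2 = np.logical_and(mask_a, mask_b).sum()      # S2: pixels filled in both images
    return 2.0 * s2 / (mask_a.sum() + mask_b.sum())

# consistent = coincidence_degree(mask_a, mask_b) >= 0.8   # e.g. 80 % threshold
```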
S160, if the labeling curves in the two lesion markings are a closed curve and a non-closed curve respectively, judging whether the pixels corresponding to the two labeling curves overlap according to a preset ratio threshold to obtain a verification result of whether the labeling information to be verified is consistent.
If the labeling curves in the two lesion markings are a closed curve and a non-closed curve respectively, whether the pixels corresponding to the two labeling curves overlap is judged according to a preset ratio threshold to obtain a verification result of whether the labeling information to be verified is consistent. If the labeling curve of one lesion marking is a closed curve and that of the other is a non-closed curve, the pixel ratio of the two labeling curves can be obtained, and whether that pixel ratio is not less than the ratio threshold is judged to determine whether the pixels corresponding to the two labeling curves overlap.
The specific steps are as follows: pixel filling is performed in the closed region corresponding to the closed curve to obtain a corresponding pixel image; the proportion of the non-closed curve's pixels that overlap the pixel image is acquired as the pixel ratio; and whether the pixel ratio is not less than the ratio threshold is judged to obtain the verification result of whether the labeling information to be verified is consistent.
Pixel filling is performed in the closed region corresponding to the closed curve to obtain a corresponding pixel image: the closed region corresponding to the labeling curve that is a closed curve is acquired and filled with pixels, yielding a pixel image of the same size as the images in the medical image data; filled pixel points in the pixel image take the value 1 and unfilled pixel points the value 0.
The number of pixels where the non-closed curve overlaps the pixel image is acquired, and the ratio of that number to the number of pixels of the non-closed curve is calculated, that is, the proportion of the non-closed curve's pixels that are overlapped; the resulting value is the pixel ratio.
Whether the obtained pixel ratio is not less than the ratio threshold is judged. If it is not less than the ratio threshold, the pixels corresponding to the two labeling curves are judged to overlap, that is, the verification result is that the labeling information to be verified is consistent; otherwise the pixels corresponding to the two labeling curves are judged not to overlap, that is, the verification result is that the labeling information to be verified is inconsistent.
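A sketch of the pixel-ratio check of S160, reusing fill_closed_curve from the previous sketch; representing the non-closed curve by its sampled position points is a simplification of its rasterised pixels.

```python
def pixel_ratio(open_curve_points, closed_mask):
    """Share of the non-closed curve's pixels that fall inside the filled closed region."""
    inside = sum(1 for p in open_curve_points if closed_mask[p.y, p.x])
    return inside / len(open_curve_points)

# consistent = pixel_ratio(open_points, fill_closed_curve(closed_points, image_shape)) >= ratio_threshold
```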
The technical method can be applied to application scenarios, such as smart healthcare, that involve verifying the consistency of lesion marking information, thereby promoting the construction of smart cities.
In the method for verifying lesion markings provided by the embodiment of the invention, two lesion markings of any image in medical image data are acquired as labeling information to be verified, and whether the type information of the two lesion markings is the same is judged; if it is the same, whether the labeling curves in the two lesion markings are non-closed curves is judged; if both labeling curves are non-closed curves, whether they are similar is judged according to a curve judgment rule based on a dynamic time warping algorithm to obtain a verification result of whether the labeling information is consistent; if both labeling curves are closed curves, whether their closed regions coincide is judged according to a coincidence-degree judgment rule to obtain the verification result; and if the two labeling curves are a closed curve and a non-closed curve respectively, whether the pixels corresponding to the two labeling curves overlap is judged according to a ratio threshold to obtain the verification result. With this method, the consistency of lesion markings is verified against a unified standard, and the efficiency and quality of lesion marking verification can be greatly improved.
The embodiment of the present invention further provides a device for verifying a lesion marking, which is used for performing any embodiment of the method for verifying a lesion marking. Specifically, referring to fig. 9, fig. 9 is a schematic block diagram of a lesion marking verification apparatus according to an embodiment of the present invention. The lesion marking verification device may be disposed in the management server 10.
As shown in fig. 9, the lesion marking verification apparatus 100 includes a marking information to be verified acquisition unit 110, a type information judgment unit 120, a curve type judgment unit 130, a first verification unit 140, a second verification unit 150, and a third verification unit 160.
A to-be-verified marking information obtaining unit 110, configured to receive medical image data with a lesion marking added from the user terminal, and obtain two lesion markings of any one image in the medical image data as to-be-verified marking information.
In an embodiment, the device 100 for verifying a lesion marking further comprises a sub-unit: a medical image data sending unit.
The medical image data sending unit is used for sending medical image data without lesion markings to a plurality of user terminals, so as to obtain the lesion markings added through the plurality of user terminals by the corresponding users and thereby obtain the medical image data with lesion markings added.
A type information determining unit 120, configured to determine whether the type information of the two lesion markings in the marking information to be verified is the same.
A curve type determining unit 130, configured to determine whether a labeled curve in two of the lesion labels is a closed curve to obtain a curve type determination result if the type information of the two lesion labels is the same.
In an embodiment, the curve type determining unit 130 includes sub-units: a determining unit and a distance difference judging unit.
The determining unit is used for determining a starting point and an end point of the labeling curve according to the acquisition time of the position point in the labeling curve; and the distance difference judging unit is used for judging whether the distance difference between the starting point and the end point is smaller than a preset distance threshold value so as to obtain a judgment result whether the labeled curve is a closed curve.
The first verification unit 140 is configured to, if the labeling curves in the two lesion markings are both non-closed curves, determine whether the two labeling curves are similar according to a preset curve determination rule to obtain a verification result indicating whether the labeling information to be verified is consistent, where the curve determination rule is a determination rule based on a dynamic time warping algorithm.
In one embodiment, the first verification unit 140 includes sub-units: a span difference judging unit, an offset distance judging unit, a path similarity judging unit and a similarity judging unit.
And the span difference judging unit is used for acquiring the span difference of the two marked curves according to the span acquisition rule and judging whether the span difference is not greater than the span threshold value.
In one embodiment, the span difference judging unit includes sub-units: a span difference acquisition unit and a judging unit.
The span difference acquisition unit is used for respectively acquiring the transverse span difference and the longitudinal span difference of the two marked curves according to the span acquisition rule; and the judging unit is used for judging whether the transverse span difference is not larger than a transverse span threshold value and whether the longitudinal span difference is not larger than the longitudinal span threshold value so as to obtain a judgment result whether the span difference is not larger than the span threshold value.
And the offset distance judging unit is used for acquiring the offset distance of the two marked curves according to the offset distance acquisition rule and judging whether the offset distance is greater than the offset threshold value or not if the span difference is not greater than the span threshold value.
In one embodiment, the first verification unit 140 includes sub-units: a time series judging unit and a curve adjusting unit.
The time sequence judging unit is used for acquiring the time sequence corresponding to each labeling curve according to the time axis of each labeling curve and judging whether the two time sequences are unified or not; and the curve adjusting unit is used for adjusting the two marked curves to enable the time sequences of the two marked curves to be uniform if the two time sequences are not uniform.
And the path similarity judging unit is used for acquiring the path similarity of the two labeling curves according to the dynamic time warping algorithm and judging whether the path similarity is greater than the path similarity threshold value or not if the offset distance is greater than the offset threshold value.
And the similarity judging unit is used for calculating the similarity according to the similarity calculation formula, the span difference, the path similarity and the offset distance if the path similarity is not smaller than the path-similarity threshold, and judging whether the similarity is greater than the similarity threshold so as to obtain the verification result of whether the labeling information to be verified is consistent.
And a second verification unit 150, configured to, if the labeling curves in the two lesion markings are both closed curves, determine whether the closed regions corresponding to the two labeling curves coincide according to a preset coincidence-degree judgment rule, so as to obtain a verification result of whether the labeling information to be verified is consistent.
In one embodiment, the second verification unit 150 includes sub-units: a pixel image acquisition unit, a coincidence degree calculation unit and a coincidence degree judging unit.
The pixel image acquisition unit is used for performing pixel filling in the closed regions corresponding to the two labeling curves to obtain two corresponding pixel images; the coincidence degree calculation unit is used for calculating the corresponding coincidence degree according to the coincidence-degree calculation formula, the overlapping pixels of the two pixel images and the effective pixels of each pixel image; and the coincidence degree judging unit is used for judging whether the coincidence degree is not less than the coincidence-degree threshold to determine whether the closed regions corresponding to the two labeling curves coincide, so as to obtain the verification result of whether the labeling information to be verified is consistent.
A third verification unit 160, configured to, if labeling curves in the two lesion labels are a closed curve and a non-closed curve, determine whether pixels corresponding to the two labeling curves are overlapped according to a preset ratio threshold, so as to obtain a verification result indicating whether the labeling information to be verified is consistent.
The device for verifying lesion markings provided by the embodiment of the invention applies the above method: two lesion markings of any image in medical image data are acquired as labeling information to be verified, and whether the type information of the two lesion markings is the same is judged; if it is the same, whether the labeling curves in the two lesion markings are non-closed curves is judged; if both labeling curves are non-closed curves, whether they are similar is judged according to a curve judgment rule based on a dynamic time warping algorithm to obtain a verification result of whether the labeling information is consistent; if both labeling curves are closed curves, whether their closed regions coincide is judged according to a coincidence-degree judgment rule to obtain the verification result; and if the two labeling curves are a closed curve and a non-closed curve respectively, whether the pixels corresponding to the two labeling curves overlap is judged according to a ratio threshold to obtain the verification result. In this way, the consistency of lesion markings is verified against a unified standard, and the efficiency and quality of lesion marking verification can be greatly improved.
The above-mentioned lesion marking verification apparatus may be implemented in the form of a computer program, which may be run on a computer device as shown in fig. 10.
Referring to fig. 10, fig. 10 is a schematic block diagram of a computer device according to an embodiment of the present invention. The computer device may be a management server for performing a method of verification of lesion marking to verify consistency of lesion marking.
Referring to fig. 10, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032, when executed, causes the processor 502 to perform a method of lesion marking verification.
The processor 502 is used to provide computing and control capabilities that support the operation of the overall computer device 500.
The internal memory 504 provides an environment for running the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 may be caused to execute a lesion marking verification method.
The network interface 505 is used for network communication, such as providing transmission of data information. Those skilled in the art will appreciate that the configuration shown in fig. 10 is a block diagram of only a portion of the configuration associated with the solution of the present invention and does not limit the computer device 500 to which the solution of the present invention is applied; a particular computer device 500 may include more or fewer components than those shown, or combine certain components, or have a different arrangement of components.
The processor 502 is configured to run the computer program 5032 stored in the memory to implement the corresponding functions in the lesion marking verification method.
Those skilled in the art will appreciate that the embodiment of a computer device illustrated in fig. 10 does not constitute a limitation on the specific construction of the computer device, and that in other embodiments a computer device may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. For example, in some embodiments, the computer device may only include a memory and a processor, and in such embodiments, the structures and functions of the memory and the processor are consistent with those of the embodiment shown in fig. 10, and are not described herein again.
It should be understood that, in the embodiment of the present invention, the processor 502 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
In another embodiment of the present invention, a computer-readable storage medium is provided. The computer-readable storage medium may be a non-volatile computer-readable storage medium. The computer-readable storage medium stores a computer program, wherein the computer program, when executed by a processor, implements the steps of the method for verifying lesion marking described above.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of the two; to clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus, device and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative; the division of the units is only a logical function division, and there may be other divisions in actual implementation: units having the same function may be grouped into one unit, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electrical, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present invention that in essence contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a computer-readable storage medium, the software product including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned computer-readable storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for verifying lesion marking, applied to a management server, wherein the management server communicates with a plurality of user terminals, and the method comprises the following steps:
receiving, from the user terminals, medical image data to which lesion labels have been added, and acquiring two lesion labels of any one image in the medical image data as labeling information to be verified;
judging whether the type information of the two lesion labels in the labeling information to be verified is the same;
if the type information of the two lesion labels is the same, judging whether the labeling curves in the two lesion labels are closed curves, so as to obtain a curve type judgment result;
if the labeling curves in the two lesion labels are both non-closed curves, judging whether the two labeling curves are similar according to a preset curve judgment rule to obtain a verification result of whether the labeling information to be verified is consistent, wherein the curve judgment rule is a judgment rule based on a dynamic time warping algorithm;
if the labeling curves in the two lesion labels are both closed curves, judging whether the closed regions corresponding to the two labeling curves overlap according to a preset coincidence degree determination rule to obtain a verification result of whether the labeling information to be verified is consistent;
if the labeling curves in the two lesion labels are respectively a closed curve and a non-closed curve, judging whether the pixels corresponding to the two labeling curves overlap according to a preset ratio threshold to obtain a verification result of whether the labeling information to be verified is consistent.
2. The method for verifying lesion marking according to claim 1, wherein before the receiving, from the user terminals, the medical image data to which the lesion labels have been added and acquiring the two lesion labels of any one image in the medical image data as the labeling information to be verified, the method further comprises:
sending medical image data to which no lesion labels have been added to the plurality of user terminals, so that a plurality of corresponding users add lesion labels through the plurality of user terminals, thereby obtaining the medical image data to which the lesion labels have been added.
3. The method for verifying lesion marking according to claim 1, wherein the judging whether the labeling curves in the two lesion labels are closed curves to obtain a curve type judgment result comprises:
determining a start point and an end point of each labeling curve according to the acquisition times of the position points in the labeling curve;
judging whether the distance between the start point and the end point is smaller than a preset distance threshold, so as to obtain a judgment result of whether the labeling curve is a closed curve.
4. The method for verifying lesion marking according to claim 1, wherein the curve judgment rule includes a path similarity threshold, a span threshold, an offset threshold, a dynamic time warping algorithm, a span acquisition rule, an offset distance acquisition rule, a similarity calculation formula, and a similarity threshold, and the judging whether the two labeling curves are similar according to the preset curve judgment rule to obtain the verification result of whether the labeling information to be verified is consistent comprises:
acquiring a span difference of the two labeling curves according to the span acquisition rule, and judging whether the span difference is not greater than the span threshold;
if the span difference is not greater than the span threshold, acquiring an offset distance of the two labeling curves according to the offset distance acquisition rule, and judging whether the offset distance is greater than the offset threshold;
if the offset distance is greater than the offset threshold, acquiring a path similarity of the two labeling curves according to the dynamic time warping algorithm, and judging whether the path similarity is greater than the path similarity threshold;
if the path similarity is not smaller than the path similarity threshold, calculating a similarity according to the similarity calculation formula, the span difference, the path similarity and the offset distance, and judging whether the similarity is greater than the similarity threshold, so as to obtain the verification result of whether the labeling information to be verified is consistent.
5. The method for verifying lesion marking according to claim 4, wherein the span threshold comprises a transverse span threshold and a longitudinal span threshold, and the acquiring the span difference of the two labeling curves according to the span acquisition rule and judging whether the span difference is not greater than the span threshold comprises:
respectively acquiring a transverse span difference and a longitudinal span difference of the two labeling curves according to the span acquisition rule;
judging whether the transverse span difference is not greater than the transverse span threshold and whether the longitudinal span difference is not greater than the longitudinal span threshold, so as to obtain a judgment result of whether the span difference is not greater than the span threshold.
6. The method for verifying lesion marking according to claim 4, wherein before the acquiring the path similarity of the two labeling curves according to the dynamic time warping algorithm, the method further comprises:
acquiring a time sequence corresponding to each labeling curve according to the time axis of the labeling curve, and judging whether the two time sequences are unified;
if the two time sequences are not unified, adjusting the two labeling curves so that the time sequences of the two labeling curves are unified.
7. The method for verifying lesion marking according to claim 1, wherein the coincidence degree determination rule includes a coincidence degree calculation formula and a coincidence degree threshold, and the judging whether the closed regions corresponding to the two labeling curves overlap according to the preset coincidence degree determination rule to obtain the verification result of whether the labeling information to be verified is consistent comprises:
performing pixel filling on the closed regions corresponding to the two labeling curves to obtain two corresponding pixel images;
calculating a coincidence degree according to the coincidence degree calculation formula, overlapped pixels between the two pixel images, and effective pixels respectively corresponding to the two pixel images;
judging whether the coincidence degree is not less than the coincidence degree threshold, so as to determine whether the closed regions corresponding to the two labeling curves overlap and thereby obtain the verification result of whether the labeling information to be verified is consistent.
8. A device for verifying lesion marking, comprising:
a to-be-verified labeling information acquisition unit, configured to receive, from a user terminal, medical image data to which lesion labels have been added, and acquire two lesion labels of any one image in the medical image data as labeling information to be verified;
a type information judgment unit, configured to judge whether the type information of the two lesion labels in the labeling information to be verified is the same;
a curve type judgment unit, configured to, if the type information of the two lesion labels is the same, judge whether the labeling curves in the two lesion labels are closed curves so as to obtain a curve type judgment result;
a first verification unit, configured to, if the labeling curves in the two lesion labels are both non-closed curves, judge whether the two labeling curves are similar according to a preset curve judgment rule to obtain a verification result of whether the labeling information to be verified is consistent, wherein the curve judgment rule is a judgment rule based on a dynamic time warping algorithm;
a second verification unit, configured to, if the labeling curves in the two lesion labels are both closed curves, judge whether the closed regions corresponding to the two labeling curves overlap according to a preset coincidence degree determination rule to obtain a verification result of whether the labeling information to be verified is consistent;
a third verification unit, configured to, if the labeling curves in the two lesion labels are respectively a closed curve and a non-closed curve, judge whether the pixels corresponding to the two labeling curves overlap according to a preset ratio threshold to obtain a verification result of whether the labeling information to be verified is consistent.
9. A computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for verifying lesion marking according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the method for verifying lesion marking according to any one of claims 1 to 7.
CN202011053234.4A 2020-09-29 2020-09-29 Method and device for verifying lesion marking, computer equipment and storage medium Active CN111932536B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011053234.4A CN111932536B (en) 2020-09-29 2020-09-29 Method and device for verifying lesion marking, computer equipment and storage medium
PCT/CN2021/096198 WO2022068228A1 (en) 2020-09-29 2021-05-27 Lesion mark verification method and apparatus, and computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011053234.4A CN111932536B (en) 2020-09-29 2020-09-29 Method and device for verifying lesion marking, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111932536A true CN111932536A (en) 2020-11-13
CN111932536B CN111932536B (en) 2021-03-05

Family

ID=73334760

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011053234.4A Active CN111932536B (en) 2020-09-29 2020-09-29 Method and device for verifying lesion marking, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN111932536B (en)
WO (1) WO2022068228A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117152144B (en) * 2023-10-30 2024-01-30 潍坊华潍新材料科技有限公司 Guide roller monitoring method and device based on image processing


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028924A (en) * 2019-10-21 2020-04-17 西安电子科技大学 Method and system for labeling medical image data in various forms
CN111932536B (en) * 2020-09-29 2021-03-05 平安国际智慧城市科技股份有限公司 Method and device for verifying lesion marking, computer equipment and storage medium

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150078639A1 (en) * 2013-09-19 2015-03-19 Siemens Aktiengesellschaft Method for evaluating an medical examination
CN105404896A (en) * 2015-11-03 2016-03-16 北京旷视科技有限公司 Annotation data processing method and annotation data processing system
CN107563123A (en) * 2017-09-27 2018-01-09 百度在线网络技术(北京)有限公司 Method and apparatus for marking medical image
CN108932724A (en) * 2018-05-31 2018-12-04 杭州晓图科技有限公司 A kind of system automatic auditing method based on multi-person synergy image labeling
CN109509197A (en) * 2018-09-26 2019-03-22 沈阳东软医疗系统有限公司 A kind of method, apparatus, equipment and storage medium for dividing area-of-interest
CN110390667A (en) * 2019-06-18 2019-10-29 平安科技(深圳)有限公司 Lesion extracting method, device, equipment and storage medium based on eyeground OCT image
CN110853739A (en) * 2019-10-16 2020-02-28 平安科技(深圳)有限公司 Image management display method, device, computer equipment and storage medium
CN110867243A (en) * 2019-10-16 2020-03-06 平安科技(深圳)有限公司 Image annotation method, device, computer system and readable storage medium
CN110991486A (en) * 2019-11-07 2020-04-10 北京邮电大学 Method and device for controlling quality of multi-person collaborative image annotation
CN110968761A (en) * 2019-11-29 2020-04-07 福州大学 Self-adaptive extraction method for webpage structured data
CN111178590A (en) * 2019-12-09 2020-05-19 武汉光庭信息技术股份有限公司 Method and system for building optimization marking team
CN111159167A (en) * 2019-12-30 2020-05-15 上海依图网络科技有限公司 Labeling quality detection device and method
CN111324637A (en) * 2020-02-05 2020-06-23 北京工业大数据创新中心有限公司 Fault symptom searching method and system for industrial time sequence data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CAMILLE KURTZ ET AL.: "A SEMANTIC FRAMEWORK FOR THE RETRIEVAL OF SIMILAR RADIOLOGICAL IMAGES BASED ON MEDICAL ANNOTATIONS", ICIP 2014 *
HU PING: "Design and Implementation of Quality Management in a Crowd Intelligence Annotation System" (in Chinese), Computer and Network Security *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022068228A1 (en) * 2020-09-29 2022-04-07 平安国际智慧城市科技股份有限公司 Lesion mark verification method and apparatus, and computer device and storage medium

Also Published As

Publication number Publication date
CN111932536B (en) 2021-03-05
WO2022068228A1 (en) 2022-04-07

Similar Documents

Publication Publication Date Title
CN102236899B (en) Method and device for detecting objects
CN111932533B (en) Method, device, equipment and medium for positioning vertebrae by CT image
CN111932536B (en) Method and device for verifying lesion marking, computer equipment and storage medium
CN110889826B (en) Eye OCT image focus region segmentation method, device and terminal equipment
CN111340756B (en) Medical image lesion detection merging method, system, terminal and storage medium
JP4515332B2 (en) Image processing apparatus and target area tracking program
CN110309060B (en) Detection method and device for updating identification algorithm, storage medium and computer equipment
CN114842003B (en) Medical image follow-up target pairing method, device and application
CN111768418A (en) Image segmentation method and device and training method of image segmentation model
WO2020168647A1 (en) Image recognition method and related device
CN111782529B (en) Test method and device for auxiliary diagnosis system, computer equipment and storage medium
CN110287767A (en) Can attack protection biopsy method, device, computer equipment and storage medium
CN113450329B (en) Microcirculation image blood vessel branch erythrocyte flow rate calculation method and system
CN110111382B (en) Irregular area calculation method and device, computer equipment and storage medium
US9483705B2 (en) Image processing device, image processing method, and image processing program
CN116130090A (en) Ejection fraction measuring method and device, electronic device, and storage medium
CN113658097B (en) Training method and device for fundus image quality enhancement model
CN117237435B (en) Tumor prognosis effect evaluation method, device, electronic equipment and storage medium
CN111784660B (en) Method and system for analyzing frontal face degree of face image
CN109767468B (en) Visceral volume detection method and device
CN111753723B (en) Fingerprint identification method and device based on density calibration
JP5144706B2 (en) Image processing device
CN110934565A (en) Method and device for measuring pupil diameter and computer readable storage medium
CN110689112A (en) Data processing method and device
CN112950582B (en) 3D lung focus segmentation method and device based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231114

Address after: Room 2601 (Unit 07), Qianhai Free Trade Building, No. 3048, Xinghai Avenue, Nanshan Street, Qianhai Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong 518000

Patentee after: Shenzhen Ping An Smart Healthcare Technology Co.,Ltd.

Address before: 1-34 / F, Qianhai free trade building, 3048 Xinghai Avenue, Mawan, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong 518000

Patentee before: Ping An International Smart City Technology Co.,Ltd.