CN115035057B - Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye


Info

Publication number
CN115035057B
CN115035057B CN202210606965.XA CN202210606965A
Authority
CN
China
Prior art keywords
gray
anterior chamber
pixel points
pixel point
anterior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210606965.XA
Other languages
Chinese (zh)
Other versions
CN115035057A (en)
Inventor
王晓春
惠博阳
周盛
王效宁
李泽萌
计建军
巩丽文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TIANJIN EYE HOSPITAL
Institute of Biomedical Engineering of CAMS and PUMC
Original Assignee
TIANJIN EYE HOSPITAL
Institute of Biomedical Engineering of CAMS and PUMC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TIANJIN EYE HOSPITAL, Institute of Biomedical Engineering of CAMS and PUMC filed Critical TIANJIN EYE HOSPITAL
Priority to CN202210606965.XA priority Critical patent/CN115035057B/en
Publication of CN115035057A publication Critical patent/CN115035057A/en
Application granted granted Critical
Publication of CN115035057B publication Critical patent/CN115035057B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
        • G06T 7/00 Image analysis → G06T 7/0002 Inspection of images, e.g. flaw detection → G06T 7/0012 Biomedical image inspection
        • G06T 5/00 Image enhancement or restoration → G06T 5/70 Denoising; Smoothing
        • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 Image acquisition modality → G06T 2207/10132 Ultrasound image
            • G06T 2207/20 Special algorithmic details → G06T 2207/20081 Training; Learning
            • G06T 2207/20 Special algorithmic details → G06T 2207/20084 Artificial neural networks [ANN]
            • G06T 2207/30 Subject of image; Context of image processing → G06T 2207/30004 Biomedical image processing → G06T 2207/30041 Eye; Retina; Ophthalmic
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS → G06N 3/00 Computing arrangements based on biological models → G06N 3/02 Neural networks → G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides a method, an apparatus, a storage medium, and a device for acquiring the aqueous humor cell concentration of the anterior chamber of an eye. The method comprises the following steps: acquiring an anterior chamber region and a corresponding region range value from an ultrasonic image of a target anterior segment; denoising the anterior chamber region to obtain a plurality of gray pixel point regions separated by the denoised pixel points, and determining these gray pixel point regions as high-brightness gray-scale spots of the anterior chamber region; for each high-brightness gray-scale spot, taking the pixel points within its range as target pixel points and obtaining the gray difference between each target pixel point and all of its adjacent pixel points; obtaining the cell pixel points according to all the gray differences corresponding to each target pixel point and a preset gray difference threshold; determining the number of cell pixel points as the cell number; and obtaining the aqueous humor cell concentration of the anterior chamber as the ratio of the cell number to the region range value. The method can improve the measurement accuracy of the aqueous humor cell concentration.

Description

Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye
Technical Field
The invention relates to the technical field of aqueous humor cell concentration detection, and in particular to a method, an apparatus, a storage medium, and a device for acquiring the aqueous humor cell concentration of the anterior chamber of an eye.
Background
Aqueous humor cell concentration is a physiological parameter of the human eye. Normally, because of the blood-aqueous barrier, aqueous humor appears as a transparent, clear liquid with little protein content and almost no cells. When vascular permeability increases, large amounts of blood cells, proteins, or other cellular material such as cellulosic exudates infiltrate into the aqueous humor. Obtaining the aqueous humor cell concentration therefore reveals whether cellular material has infiltrated the human eye and to what extent. In the prior art, however, the measurement of aqueous humor cell concentration suffers from low detection accuracy.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings in the prior art and provide a method, a device, a storage medium and equipment for acquiring the aqueous humor cell concentration of the anterior chamber of an eye, which can improve the measurement accuracy of the aqueous humor cell concentration of the anterior chamber of the eye.
One embodiment of the present invention provides a method for obtaining aqueous humor cell concentration of an anterior chamber of an eye, comprising:
Acquiring an ultrasonic image of an anterior segment of a target eye;
acquiring an anterior chamber region and a corresponding region range value from the ultrasonic image;
denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point area is an aggregation area of a plurality of pixel points with the gray values still larger than 0 after noise reduction; the high-brightness gray scale spots are formed by the scattering effect of ultrasonic waves on cells in the anterior chamber region of the eye;
for each high-brightness gray-scale spot, taking a pixel point in the range of each high-brightness gray-scale spot as a target pixel point, traversing the target pixel point, and obtaining gray-scale difference values of each target pixel point and all adjacent pixel points;
obtaining cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
determining the number of the cell pixel points as the number of cells;
and obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value.
One embodiment of the present invention also provides an aqueous humor cell concentration obtaining apparatus, including:
the image acquisition module is used for acquiring an ultrasonic image of the anterior segment of the target eye;
the regional scope acquisition module is used for acquiring an anterior chamber region and a corresponding regional scope value from the ultrasonic image;
the high-brightness gray-scale spot acquisition module is used for carrying out noise reduction on the anterior chamber area to obtain a plurality of gray-scale pixel point areas divided by the noise-reduced pixel points, and determining the gray-scale pixel point areas as high-brightness gray-scale spots of the anterior chamber area; the gray pixel point area is an aggregation area of a plurality of pixel points with the gray values still larger than 0 after noise reduction; the high-brightness gray scale spots are formed by the scattering effect of ultrasonic waves on cells in the anterior chamber region of the eye;
the gray difference value acquisition module is used for taking the pixel points in the range of each high-brightness gray scale spot as target pixel points, traversing the target pixel points and acquiring gray difference values of each target pixel point and all adjacent pixel points;
the cell pixel acquisition module is used for acquiring cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
The cell number acquisition module is used for determining the number of the cell pixel points as the cell number;
and the aqueous humor cell concentration calculating module is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of an eye as described above.
An embodiment of the invention also provides an electronic device comprising a memory, a processor and a computer program stored in the memory and executable by the processor, the processor implementing the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of the eye as described above when the computer program is executed.
Compared with the prior art, the method for obtaining the aqueous humor cell concentration of the anterior chamber of an eye denoises the anterior chamber region of the ultrasonic image to obtain the high-brightness gray-scale spots separated by the denoised pixel points; it then takes the pixel points within the range of each high-brightness gray-scale spot as target pixel points, obtains the gray differences between each target pixel point and all of its adjacent pixel points, obtains the cell pixel points from these gray differences and a preset gray difference threshold, determines the number of cell pixel points as the cell number, and calculates the aqueous humor cell concentration of the anterior chamber from the cell number and the region range value of the anterior chamber region in the ultrasonic image, thereby improving the measurement accuracy of the aqueous humor cell concentration of the anterior chamber of the eye.
In order that the invention may be more clearly understood, specific embodiments thereof will be described below with reference to the accompanying drawings.
Drawings
Fig. 1 is a flow chart of a method for aqueous humor cell concentration acquisition of the anterior chamber of an eye in accordance with one embodiment of the present invention.
Fig. 2 is a schematic diagram of the anterior ocular segment of a patient with aqueous humor turbidity, in accordance with an embodiment of the present invention.
Fig. 3 is an ultrasonic image of the anterior ocular segment of a patient with aqueous humor turbidity, in accordance with an embodiment of the present invention.
Fig. 4 is a schematic block diagram of an aqueous humor cell concentration obtaining apparatus of an anterior chamber of an eye according to an embodiment of the present invention.
100. Cornea; 300. Iris; 500. Lens; 1. Image acquisition module; 2. Regional range acquisition module; 3. High-brightness gray-scale spot acquisition module; 4. Gray difference value acquisition module; 5. Cell pixel acquisition module; 6. Cell number acquisition module; 7. Aqueous humor cell concentration calculating module.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the embodiments of the present application, are within the scope of the embodiments of the present application.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. In the description of this application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and do not necessarily describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art as the case may be. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "when," "upon," or "in response to a determination."
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
Referring to fig. 1, a flowchart of a method for obtaining aqueous humor cell concentration of an anterior chamber of an eye according to an embodiment of the present invention includes:
s1: an ultrasound image of a target anterior segment of the eye is acquired.
The target anterior segment comprises tissue structures of the eye such as the anterior chamber, the posterior chamber, the zonules, the chamber angle, part of the lens, the peripheral vitreous body, the retina, the extraocular muscle attachment points, and the conjunctiva.
The ultrasound image may be a two-dimensional image or a three-dimensional image, and may be obtained by ultrasonic scanning of the target anterior segment with an array transducer.
The two-dimensional image may be obtained by ultrasonic scanning of the target anterior segment with the array transducer, or by taking a cross-sectional screenshot from the three-dimensional image.
When the ultrasonic image to be acquired is a three-dimensional image, the method comprises the following steps:
s11: and scanning the target anterior ocular segment through an array transducer to obtain a pre-scanned image of the target anterior ocular segment.
In step S11, the array transducer moves along a preset movement track while scanning the target anterior segment, where the movement track is perpendicular to the forward line of sight of the anterior segment.
S12: a rotation centerline is determined from the midpoint of the array transducer and the midpoint of the anterior ocular segment tissue in the pre-scan image.
Specifically, when the array transducer scans the target anterior segment, the moving route of the midpoint of the array transducer passes through the midpoint of the anterior segment tissue in the pre-scan image; a virtual line segment perpendicularly intersecting the moving route is established at that midpoint, and this virtual line segment is the rotation centerline.
S13: and moving the array transducer to one side of the rotary neutral line, driving the array transducer to rotate around the rotary neutral line, acquiring panoramic information of the front section of the target eye, and constructing a three-dimensional image of the front section of the target eye according to the panoramic information. Preferably, the array transducer is rotated about the rotational centerline for at least one complete cycle of rotation as the target anterior ocular segment is scanned. The array transducer rotates in a single clockwise direction of rotation from a starting position about the center line of rotation until reaching the starting position again, indicating that the array transducer has rotated one complete cycle. Alternatively, the step S13 may use 2 or more array transducers with the same parameters for scanning.
Panoramic information of the target anterior segment is acquired through this rotary scanning, and the three-dimensional image of the target anterior segment is constructed in three-dimensional space by a reconstruction pipeline combining registration, interpolation, and segmentation.
S2: and acquiring an anterior chamber region and a corresponding region range value from the ultrasonic image.
Referring to fig. 2-3, the anterior chamber region and corresponding region range values may be obtained from tissue structures surrounding the anterior chamber region in the ultrasound image, such as cornea 100, iris 200, lens 500, and the like.
In the ultrasonic image of the target anterior segment, if the aqueous humor in the anterior chamber region is clear and uniform and produces no echo, the anterior chamber region appears as a black image without high-brightness gray-scale spots, indicating that no cellular material has permeated into the aqueous humor. If cellular material has infiltrated the aqueous humor of the anterior chamber region, high-brightness gray-scale spots appear in the image because the acoustic wave scatters after contacting the cells. The cellular material is not limited to objects having a cellular structure; it includes, for example, cells, protein particles, and cellulosic exudates.
S3: denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point area refers to an aggregation area of a plurality of pixel points with the gray values still greater than 0 after noise reduction.
The noise reduction operation refers to reducing the gray value of the noise pixel point to 0. The noise pixel points may be pixel points having a gray value lower than a preset gray threshold.
After the noise reduction operation reduces the gray values of the noise pixel points to 0, the pixels in the anterior chamber region fall into two types: noise pixel points whose gray value has been changed to 0, and gray pixel points that retain their original gray value. The zeroed noise pixel points isolate the gray pixel points, so a number of gray pixel point regions formed by aggregations of gray pixel points become visible; these regions are the high-brightness gray-scale spots. Because the high-brightness gray-scale spots are formed by the scattering of ultrasonic waves on cells in the anterior chamber region, the gray values of their pixels exceed the preset gray threshold and are unaffected by the noise reduction operation; that is, the gray pixel point regions formed by aggregations of pixels unaffected by noise reduction are the high-brightness gray-scale spots.
S4: and for each high-brightness gray-scale spot, taking the pixel point in each high-brightness gray-scale spot range as a target pixel point, traversing the target pixel point, and obtaining the gray-scale difference value between each target pixel point and all adjacent pixel points.
An adjacent pixel point is a relative concept: it is any other pixel point bordering the current target pixel point (or current pixel point), so the adjacent pixel points surround the corresponding target pixel point. An adjacent pixel point need not lie inside a high-brightness gray-scale spot; that is, it may be a pixel point that has undergone the noise reduction treatment.
The gray difference has positive and negative values, for example: when the gray value of the target pixel point is smaller than that of an adjacent pixel point, the gray difference value between the target pixel point and the adjacent pixel point is a negative number smaller than 0, and when the gray value of the target pixel point is larger than that of an adjacent pixel point, the gray difference value between the target pixel point and the adjacent pixel point is a positive number larger than 0; when the gray value of the target pixel is equal to the gray value of an adjacent pixel, the gray difference between the target pixel and the adjacent pixel is 0.
When the ultrasonic image is a two-dimensional image, the total number of adjacent pixel points corresponding to the single target pixel point is 8, and when the ultrasonic image is a three-dimensional image, the total number of adjacent pixel points corresponding to the single target pixel point is 26.
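As a concrete illustration of step S4, the sketch below (Python/NumPy; the function name and sample values are illustrative assumptions, not part of the patent) computes the signed gray differences between one target pixel point and its 8 adjacent pixel points in a two-dimensional image; the three-dimensional case extends the same loops with a depth offset to cover 26 neighbors.

```python
import numpy as np

def neighbor_gray_differences(img, row, col):
    """Signed gray differences between the pixel at (row, col) and all of
    its adjacent pixels (8-neighborhood in a 2-D image). A difference is
    positive when the target pixel is brighter than the neighbor."""
    diffs = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # skip the target pixel itself
            r, c = row + dr, col + dc
            if 0 <= r < img.shape[0] and 0 <= c < img.shape[1]:
                diffs.append(int(img[row, col]) - int(img[r, c]))
    return diffs

# A bright target pixel (value 200) surrounded by denoised (zeroed) pixels:
patch = np.zeros((3, 3), dtype=np.uint8)
patch[1, 1] = 200
print(neighbor_gray_differences(patch, 1, 1))  # 8 differences, all +200
```

Pixels on the image border simply have fewer than 8 neighbors, which the bounds check handles.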
S5: and obtaining the cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value.
Specifically, in step S5, all the gray differences corresponding to each target pixel point may be compared with a preset gray difference threshold according to a preset judgment rule, so as to determine the cell pixel points among the pixel points of each high-brightness gray-scale spot. A cell pixel point represents one cellular substance or the center of one cellular substance.
S6: and determining the number of the cell pixel points as the number of cells.
Since a cell pixel point represents one cellular substance or the center of one cellular substance, the number of cells is equivalent to the number of cell pixel points. Here the cell number refers to the number of cellular substances, which are not limited to objects having a cellular structure.
S7: and obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value.
Compared with the prior art, the method for obtaining the aqueous humor cell concentration of the anterior chamber of an eye denoises the anterior chamber region of the ultrasonic image to obtain the high-brightness gray-scale spots separated by the denoised pixel points; it then takes the pixel points within the range of each high-brightness gray-scale spot as target pixel points, obtains the gray differences between each target pixel point and all of its adjacent pixel points, obtains the cell pixel points from these gray differences and a preset gray difference threshold, determines the number of cell pixel points as the cell number, and calculates the aqueous humor cell concentration of the anterior chamber from the cell number and the region range value of the anterior chamber region in the ultrasonic image, thereby improving the measurement accuracy of the aqueous humor cell concentration of the anterior chamber of the eye.
In one possible embodiment, the ultrasound image is a two-dimensional image; the two-dimensional image is a sectional image through the pupil center of the anterior ocular segment; in step S2, the step of acquiring the anterior chamber area and the corresponding area range value from the ultrasound image includes:
S201: from the two-dimensional image, a corneal endothelial layer interface, an anterior iris interface, and an anterior lens capsule interface are identified.
Step S201 may be implemented by a pre-trained neural network interface recognition model, or by determining a region range selected by the user in the two-dimensional image as the corneal endothelial layer interface, the anterior iris interface, or the anterior lens capsule interface, thereby achieving the effect of recognizing these three interfaces.
S202: a two-dimensional closed region surrounded by the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface is determined as the anterior chamber region.
After the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface are identified, the two-dimensional closed region they enclose can be further identified by a pre-trained neural network anterior chamber region identification model, thereby obtaining the anterior chamber region.
S203: the area of the anterior chamber region is determined as the region range value.
The area of the anterior chamber region can be calculated from the number of pixel points it contains and the size ratio between the two-dimensional image and the actual cross section through the pupil center of the anterior segment.
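A hedged sketch of this area computation, assuming a boolean mask of anterior chamber pixels and a per-pixel calibration in millimeters (the names and the calibration value are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def region_area(mask, mm_per_pixel):
    """Area of the anterior chamber region: the number of region pixels
    times the physical area covered by a single pixel.
    `mask` marks anterior-chamber pixels; `mm_per_pixel` is the assumed
    size calibration of the two-dimensional ultrasound image."""
    pixel_area_mm2 = mm_per_pixel ** 2
    return int(mask.sum()) * pixel_area_mm2

mask = np.zeros((10, 10), dtype=bool)
mask[2:6, 2:6] = True           # a 4x4 region of anterior chamber pixels
print(region_area(mask, 0.5))   # 16 pixels * 0.25 mm^2 = 4.0 mm^2
```

The volume computation of step S213 is analogous, with `mm_per_pixel ** 3` per voxel.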
In a two-dimensional image, the anterior chamber region may be identified by the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface, so as to obtain an area parameter of the anterior chamber region in the two-dimensional image.
In one possible embodiment, the ultrasound image is a three-dimensional image; in step S2, the step of acquiring the anterior chamber area and the corresponding area range value from the ultrasound image includes:
s211: from the three-dimensional image, a corneal endothelial layer, an anterior iris interface, and an anterior lens capsule are identified.
Step S211 may be implemented by a pre-trained neural network recognition model, or by determining a region range selected by the user in the three-dimensional image as the corneal endothelial layer, the anterior iris interface, or the anterior lens capsule, thereby achieving the effect of recognizing these three structures.
S212: a three-dimensional closed region surrounded by the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule is determined as the anterior chamber region.
After the corneal endothelial layer, the anterior iris interface, and the anterior lens capsule are identified, the three-dimensional closed region they enclose may be further identified by a pre-trained neural network anterior chamber region identification model, thereby obtaining the anterior chamber region.
S213: the volume of the anterior chamber region is determined as the region range value.
The volume of the anterior chamber area can be calculated by the number of pixels and the size ratio of the three-dimensional image to the actual anterior segment.
In the three-dimensional image, the anterior chamber region may be identified by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule so as to obtain a volumetric parameter of the anterior chamber region in the three-dimensional image.
In a possible embodiment, the step of denoising the anterior chamber area to obtain a plurality of gray pixel areas divided by the denoised pixel points, and determining the gray pixel areas as high-brightness gray-scale spots of the anterior chamber area includes:
s301: and acquiring an average gray value of the anterior chamber area in the ultrasonic image.
The average gray value of the anterior chamber area refers to the average gray value of all pixels within the range of the anterior chamber area.
S302: and determining the pixel points lower than the average gray value in the anterior chamber area as noise pixel points, and performing gray value reduction processing on the noise pixel points to reduce the gray value of the noise pixel points to a preset 0.
Since noise pixel points in an ultrasonic image generally have low gray values below the average gray value, determining the pixels below the average gray value as noise pixel points and reducing their gray values achieves the noise reduction effect.
S303: determining the pixel points higher than or equal to the average gray value in the anterior chamber area as gray pixel points; and determining the aggregation area of the gray pixel points isolated by the noise pixel points after noise reduction as the high-brightness gray-scale spots of the anterior chamber area.
Wherein, an aggregation area composed of continuous gray pixel points represents a high-brightness gray-scale spot.
In this embodiment, reducing the noise pixel points to 0 removes their interference with the high-brightness gray-scale spots, so the high-brightness gray-scale spots can be obtained.
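Steps S301 to S303 can be sketched as follows (Python/NumPy; the 8-connectivity choice, the function name, and the sample values are illustrative assumptions):

```python
import numpy as np
from collections import deque

def high_brightness_spots(region):
    """S301-S303: zero out pixels below the region's average gray value
    (noise), then group the surviving gray pixels into connected
    aggregation areas, i.e. the high-brightness gray-scale spots."""
    region = region.astype(float)
    avg = region.mean()                             # S301: average gray value
    denoised = np.where(region < avg, 0.0, region)  # S302: noise -> 0
    # S303: collect 8-connected components of the remaining gray pixels
    visited = np.zeros(region.shape, dtype=bool)
    spots = []
    for r in range(region.shape[0]):
        for c in range(region.shape[1]):
            if denoised[r, c] > 0 and not visited[r, c]:
                queue, spot = deque([(r, c)]), []
                visited[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    spot.append((y, x))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < region.shape[0]
                                    and 0 <= nx < region.shape[1]
                                    and denoised[ny, nx] > 0
                                    and not visited[ny, nx]):
                                visited[ny, nx] = True
                                queue.append((ny, nx))
                spots.append(spot)
    return denoised, spots

img = np.array([[10, 10, 10, 10],
                [10, 90, 10, 10],
                [10, 10, 10, 80]])
_, spots = high_brightness_spots(img)
print(len(spots))  # two isolated high-brightness gray-scale spots
```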
In a possible embodiment, the step of obtaining a cellular pixel according to all the gray-scale differences corresponding to each target pixel and a preset gray-scale difference threshold value includes:
s501: comparing all the gray level differences corresponding to the target pixel points with a preset gray level difference threshold, and determining the corresponding target pixel points as cell pixel points if all the gray level differences are larger than the gray level difference threshold.
In this embodiment, since the gray values near a cell pixel point are formed by the back-scattering of the sound wave after it contacts the cellular material, the gray difference values between a cell pixel point and its neighbors are large; a cell pixel point can therefore be confirmed by setting an appropriate gray difference threshold. Specifically, the gray difference threshold is determined by the user. By comparing all the gray difference values corresponding to each target pixel point with the gray difference threshold, the cell pixel points representing a cell or a cell center can be determined from among the target pixel points.
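Step S501 can be sketched as follows (hypothetical function name; assumes a NumPy array and an interior target pixel, so that all 8 neighbors exist):

```python
import numpy as np

def is_cell_pixel(img: np.ndarray, y: int, x: int, thresh: int) -> bool:
    """Sketch of S501: a target pixel point is a cell pixel point only if
    its gray difference to every one of its 8 adjacent pixel points
    exceeds the user-chosen gray difference threshold."""
    diffs = [int(img[y, x]) - int(img[y + dy, x + dx])
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)
             if not (dy == 0 and dx == 0)]
    return all(d > thresh for d in diffs)
```

A bright isolated pixel passes; if even one neighbor is nearly as bright, the test fails, which is exactly the case steps S511-S513 below are designed to handle.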
In order to prevent the situation that the cell material is just located between two pixel points and leads to misjudging the quantity of the cell material, in a possible embodiment, the step of obtaining the cell pixel points according to all the gray-scale differences corresponding to each target pixel point and a preset gray-scale difference threshold value includes:
S511: and comparing all the gray level difference values corresponding to the target pixel points with a preset gray level difference threshold value.
Wherein the gray difference threshold is set by the user; all the gray difference values corresponding to each target pixel point are compared with this threshold.
S512: if only one gray difference value is smaller than or equal to the gray difference threshold value in all the gray difference values of the same target pixel point, determining the corresponding target pixel point as a to-be-determined pixel point.
For example, taking a two-dimensional image as an example, if 7 gray differences among 8 gray differences corresponding to one target pixel point are greater than the gray difference threshold, and 1 gray difference is less than or equal to the gray difference threshold, the target pixel point is the pixel point to be determined.
S513: and acquiring the position relation of all the undetermined pixel points, and determining one pixel point in the adjacent undetermined pixel points as a cell pixel point.
Preferably, in step S513, the pixel point having the higher gray value among adjacent undetermined pixel points may be determined as the cell pixel point, because the undetermined pixel point with the higher gray value covers a larger area or volume of the cellular material than the other undetermined pixel point.
In this embodiment, through steps S511-S513, the accuracy of the statistics of the cell pixels can be improved, and the situation that the cell or the cell center is just located between two pixels and affects the statistics of the cell number is prevented.
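Steps S511-S513 can be sketched as follows (hypothetical helper; it assumes the undetermined pixel points have already been collected as (row, column) tuples, and resolves adjacent undetermined pixel points to the one with the higher gray value, per the preferred variant of S513):

```python
import numpy as np

def resolve_pending(pending, img: np.ndarray):
    """Sketch of S511-S513: when a piece of cellular material sits between
    two pixel points, both become 'undetermined'; count the pair as one
    cell by keeping only the undetermined pixel with the higher gray value
    (it covers more of the cellular material)."""
    cells, used = [], set()
    for p in pending:
        if p in used:
            continue
        # group p with its 8-connected undetermined neighbours
        group = [p] + [q for q in pending
                       if q not in used and q != p
                       and abs(q[0] - p[0]) <= 1 and abs(q[1] - p[1]) <= 1]
        used.update(group)
        cells.append(max(group, key=lambda t: img[t]))
    return cells
```

With two adjacent undetermined pixels of gray values 100 and 120 plus one isolated undetermined pixel, the helper returns two cell pixel points, avoiding the double count.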
Referring to fig. 4, an embodiment of the present invention further provides an aqueous humor cell concentration obtaining apparatus, including:
an image acquisition module 1 for acquiring an ultrasonic image of an anterior segment of a target eye;
a region range obtaining module 2, configured to obtain an anterior chamber region and a corresponding region range value from the ultrasound image;
the high-brightness gray-scale spot acquisition module 3 is used for carrying out noise reduction on the anterior chamber area to obtain a plurality of gray-scale pixel point areas divided by the noise-reduced pixel points, and determining the gray-scale pixel point areas as high-brightness gray-scale spots of the anterior chamber area; the gray pixel point area is an aggregation area of a plurality of pixel points with the gray values still larger than 0 after noise reduction; the high-brightness gray scale spots are formed by the scattering effect of ultrasonic waves on cells in the anterior chamber region of the eye;
the gray level difference value obtaining module 4 is configured to, for each high-brightness gray level spot, take a pixel point in each high-brightness gray level spot range as a target pixel point, traverse the target pixel point, and obtain gray level difference values of each target pixel point and all adjacent pixel points;
The cellular pixel acquisition module 5 is configured to obtain cellular pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
a cell number acquisition module 6 for determining the number of the cell pixels as a cell number;
and the aqueous humor cell concentration calculating module 7 is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value.
The target anterior segment comprises tissue structures of the eye such as the anterior chamber, the posterior chamber, the zonules, the anterior chamber angle, part of the lens, the peripheral vitreous body, the retina, the extraocular muscle attachment points, and the conjunctiva.
The ultrasound image may be a two-dimensional image or a three-dimensional image.
In the ultrasonic image of the target anterior segment, if the aqueous humor in the anterior chamber area is clear and uniform and has no echo information, the image of the anterior chamber area shows a clean dark region, indicating that no cellular material has permeated into the aqueous humor of the anterior chamber area. If cellular material has infiltrated the aqueous humor in the anterior chamber area, the image of the anterior chamber area will contain high-brightness gray-scale spots caused by the scattering of the acoustic wave after it contacts the cells. The cellular material is not limited to objects having a cellular structure; it includes, but is not limited to, cells, protein particles, fibrinous exudates, and the like.
The anterior chamber region and the corresponding region range value may be obtained from the tissue structures surrounding the anterior chamber region in the ultrasonic image, such as the corneal endothelium, the iris, and the lens.
The noise reduction operation refers to reducing the gray value of the noise pixel point to 0. The noise pixel points may be pixel points having a gray value lower than a preset gray threshold.
The adjacent pixel point is a relative concept, and refers to other pixel points adjacent to the current target pixel point (or the current pixel point), and the adjacent pixel point may surround the corresponding current target pixel point (or the current pixel point). And the adjacent pixel points are not located in the high-brightness gray scale spots, that is, the adjacent pixel points can be the pixel points after the noise reduction treatment.
The gray difference value is signed. When the gray value of the target pixel point is smaller than that of an adjacent pixel point, the gray difference value between them is a negative number less than 0; when the gray value of the target pixel point is larger than that of an adjacent pixel point, the gray difference value is a positive number greater than 0; and when the two gray values are equal, the gray difference value is 0.
When the ultrasonic image is a two-dimensional image, the total number of adjacent pixel points corresponding to the single target pixel point is 8, and when the ultrasonic image is a three-dimensional image, the total number of adjacent pixel points corresponding to the single target pixel point is 26.
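The neighbor counts above follow directly from the size of the surrounding block of pixels; a one-line check (illustrative only, not part of the embodiment):

```python
def neighbour_count(ndim: int) -> int:
    """In an n-dimensional grid, a pixel's neighbours form the 3x3
    (or 3x3x3) block around it, minus the pixel itself: 3**n - 1."""
    return 3 ** ndim - 1
```

This gives 8 adjacent pixel points for a two-dimensional image and 26 for a three-dimensional image, matching the counts stated above.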
Since a cell pixel point represents one piece of cellular material or the center of one piece of cellular material, the cell number is equivalent to the number of cell pixel points. The cell number here refers to the number of pieces of cellular material, and the cellular material is not necessarily limited to objects having a cellular structure.
Compared with the prior art, the aqueous humor cell concentration obtaining apparatus for the anterior chamber of the eye performs noise reduction on the anterior chamber area in the ultrasonic image to obtain high-brightness gray-scale spots divided by the noise-reduced pixel points. It then takes the pixel points within the range of each high-brightness gray-scale spot as target pixel points, obtains the gray difference values between each target pixel point and all of its adjacent pixel points, and obtains cell pixel points according to all the gray difference values corresponding to each target pixel point and a preset gray difference threshold. The number of cell pixel points is determined as the cell number, and the aqueous humor cell concentration of the anterior chamber is calculated from the cell number and the region range value of the anterior chamber area in the ultrasonic image, thereby achieving the technical effect of improving the measurement accuracy of the aqueous humor cell concentration of the anterior chamber.
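The final module's computation is a simple ratio; a minimal sketch (hypothetical function name; the unit of the result follows from whether the region range value is an area or a volume):

```python
def aqueous_humor_cell_concentration(cell_count: int, region_value: float) -> float:
    """Sketch of module 7: concentration = cell number / region range value.
    region_value is the anterior chamber's area (2-D image) or volume
    (3-D image), so the result is cells per unit area or per unit volume."""
    if region_value <= 0:
        raise ValueError("region range value must be positive")
    return cell_count / region_value
```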
In one possible embodiment, the ultrasound image is a two-dimensional image; the two-dimensional image is a sectional image through the pupil center of the anterior ocular segment; the regional scope acquisition module 2 includes:
The first identification module is used for identifying the corneal endothelial layer interface, the iris anterior interface, and the lens anterior capsule interface from the two-dimensional image.
A first anterior chamber region acquisition module for determining a two-dimensional closed region surrounded by the corneal endothelial layer interface, the iris anterior interface, and the lens anterior capsule interface as the anterior chamber region.
The first regional scope value acquisition module is used for determining the area of the anterior chamber region as the region range value.
After identifying the corneal endothelial layer interface, the iris anterior interface or the lens anterior capsule interface, the two-dimensional closed region surrounded by the corneal endothelial layer interface, the iris anterior interface and the lens anterior capsule interface can be further identified by a pre-trained neural network anterior chamber region identification model, thereby obtaining the anterior chamber region.
The area of the anterior chamber area can be calculated by the number of pixel points and the size ratio of the two-dimensional image to the actual cross section of the pupil center of the anterior segment.
In a two-dimensional image, the anterior chamber region may be identified by the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface, so as to obtain an area parameter of the anterior chamber region in the two-dimensional image.
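The area computation described above can be sketched as follows. The calibration parameter `mm_per_pixel` is an assumed name: the text only specifies "the size ratio of the two-dimensional image to the actual cross section":

```python
def anterior_chamber_area(pixel_count: int, mm_per_pixel: float) -> float:
    """Sketch: the region range value (area) is the number of pixel points
    inside the anterior chamber region multiplied by the physical area of
    one pixel, derived from the image's size ratio."""
    return pixel_count * mm_per_pixel ** 2
```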
In one possible embodiment, the ultrasound image is a three-dimensional image; the regional scope acquisition module 2 includes:
The second identification module is used for identifying the corneal endothelial layer, the iris anterior interface, and the anterior lens capsule from the three-dimensional image.
A second anterior chamber region acquisition module for determining a three-dimensional closed region enclosed by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule as the anterior chamber region.
A second regional scope value acquisition module for determining the volume of the anterior chamber region as the region range value.
The second recognition module can be realized through a pre-trained neural network recognition model, or it can determine the area range selected by a user in the three-dimensional image as the corneal endothelial layer, the iris anterior interface, or the anterior lens capsule, so as to realize the effect of recognizing the corneal endothelial layer, the iris anterior interface, or the anterior lens capsule.
The volume of the anterior chamber area can be calculated by the number of pixels and the size ratio of the three-dimensional image to the actual anterior segment.
In the three-dimensional image, the anterior chamber region may be identified by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule so as to obtain a volumetric parameter of the anterior chamber region in the three-dimensional image.
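Analogously to the two-dimensional case, the volume computation can be sketched as follows (`mm_per_voxel` is an assumed calibration name; the text only specifies "the size ratio of the three-dimensional image to the actual anterior segment"):

```python
def anterior_chamber_volume(voxel_count: int, mm_per_voxel: float) -> float:
    """Sketch: the region range value (volume) is the number of voxels
    inside the anterior chamber region multiplied by the physical volume
    of one voxel, derived from the image's size ratio."""
    return voxel_count * mm_per_voxel ** 3
```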
In one possible embodiment, the high brightness gray-scale speckle obtaining module 3 includes:
the average gray value acquisition module is used for acquiring the average gray value of the anterior chamber area in the ultrasonic image;
the noise reduction module is used for determining the pixel points lower than the average gray value in the anterior chamber area as noise pixel points, and carrying out gray value reduction processing on the noise pixel points to reduce their gray values to a preset value of 0;
the high-brightness gray-scale spot determining module is used for determining pixel points which are higher than or equal to the average gray-scale value in the anterior chamber area as gray-scale pixel points; and determining the aggregation area of the gray pixel points isolated by the noise pixel points after noise reduction as the high-brightness gray-scale spots of the anterior chamber area.
The average gray value of the anterior chamber area refers to the average gray value of all pixels within the range of the anterior chamber area.
Since the gray value of the noise pixel point is generally lower and smaller than the average gray value in the ultrasonic image, the pixel point lower than the average gray value is determined as the noise pixel point, and the gray value is reduced, thereby realizing the effect of noise reduction.
Wherein, an aggregation area composed of continuous gray pixel points represents a high-brightness gray-scale spot.
In this embodiment, the noise is reduced to 0, so that the interference of the noise pixel on the high-brightness gray-scale spots can be removed, thereby obtaining the high-brightness gray-scale spots.
In one possible embodiment, the cellular pixel acquisition module 5 comprises:
the first cell pixel point acquisition module is used for comparing all the gray difference values corresponding to the target pixel points with a preset gray difference threshold, and determining the corresponding target pixel points as cell pixel points if all the gray difference values are greater than the gray difference threshold.
In this embodiment, since the gray values near a cell pixel point are formed by the back-scattering of the sound wave after it contacts the cellular material, the gray difference values between a cell pixel point and its neighbors are large; a cell pixel point can therefore be confirmed by setting an appropriate gray difference threshold. Specifically, the gray difference threshold is determined by the user. By comparing all the gray difference values corresponding to each target pixel point with the gray difference threshold, the cell pixel points representing a cell or a cell center can be determined from among the target pixel points.
In order to prevent the situation where the cell mass is located just between two pixel points, resulting in misjudging the cell mass quantity, in a possible embodiment, the cell pixel acquisition module 5 comprises:
and the gray level difference value comparison module is used for comparing all the gray level difference values corresponding to the target pixel points with a preset gray level difference threshold value.
And the undetermined pixel point acquisition module is used for determining the corresponding target pixel point as the undetermined pixel point if only one gray difference value in all the gray difference values of the same target pixel point is smaller than or equal to the gray difference threshold value.
The second cell pixel point acquisition module is used for acquiring the position relation of all the undetermined pixel points, and determining one pixel point among adjacent undetermined pixel points as a cell pixel point.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of an eye as described above.
An embodiment of the invention also provides an electronic device comprising a memory, a processor and a computer program stored in the memory and executable by the processor, the processor implementing the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of the eye as described above when the computer program is executed.
The above-described apparatus embodiments are merely illustrative, wherein the components illustrated as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (8)

1. A method for obtaining aqueous humor cell concentration of an anterior chamber of an eye, comprising:
acquiring an ultrasonic image of an anterior segment of a target eye;
acquiring an anterior chamber region and a corresponding region range value from the ultrasonic image;
Denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region; the gray pixel point area is an aggregation area of a plurality of pixel points with the gray values still larger than 0 after noise reduction;
for each high-brightness gray-scale spot, taking a pixel point in the range of each high-brightness gray-scale spot as a target pixel point, traversing the target pixel point, and obtaining gray-scale difference values of each target pixel point and all adjacent pixel points;
obtaining cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
determining the number of the cell pixel points as the number of cells;
obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value;
the step of obtaining the cell pixel point according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value comprises the following steps:
comparing all the gray difference values corresponding to the target pixel points with a preset gray difference threshold value, and determining the corresponding target pixel points as cell pixel points if all the gray difference values are larger than the gray difference threshold value;
If only one gray difference value is smaller than or equal to the gray difference threshold value in all the gray difference values of the same target pixel point, determining the corresponding target pixel point as a to-be-determined pixel point;
and acquiring the position relation of all the undetermined pixel points, and determining one pixel point in the adjacent undetermined pixel points as a cell pixel point.
2. The method for obtaining aqueous humor cell concentration of anterior chamber of eye according to claim 1, wherein: the step of denoising the anterior chamber region to obtain a plurality of gray pixel point regions divided by the denoised pixel points, and determining the gray pixel point regions as high-brightness gray scale spots of the anterior chamber region comprises the following steps:
acquiring an average gray value of the anterior chamber region in the ultrasonic image;
determining a pixel point lower than the average gray value in the anterior chamber area as a noise pixel point, and performing gray value reduction processing on the noise pixel point to reduce the gray value of the noise pixel point to a preset value of 0;
determining the pixel points higher than or equal to the average gray value in the anterior chamber area as gray pixel points; and determining the aggregation area of the gray pixel points isolated by the noise pixel points after noise reduction as the high-brightness gray-scale spots of the anterior chamber area.
3. The aqueous humor cell concentration acquisition method of an anterior chamber of an eye according to any one of claims 1 to 2, wherein the ultrasonic image is a two-dimensional image; the two-dimensional image is a sectional image through the pupil center of the anterior ocular segment;
the step of acquiring an anterior chamber region and a corresponding region range value from the ultrasound image includes:
identifying a corneal endothelial layer interface, an iris anterior interface, and a lens anterior capsule interface from the two-dimensional image;
determining a two-dimensional closed region surrounded by the corneal endothelial layer interface, the anterior iris interface, and the anterior lens capsule interface as the anterior chamber region;
the area of the anterior chamber region is determined as the region range value.
4. The aqueous humor cell concentration acquisition method of an anterior chamber of an eye according to any one of claims 1 to 2, wherein the ultrasonic image is a three-dimensional image;
the step of acquiring an anterior chamber region and a corresponding region range value from the ultrasound image includes:
identifying, from the three-dimensional image, a corneal endothelial layer, an anterior iris interface, and an anterior lens capsule;
determining a three-dimensional closed region surrounded by the corneal endothelial layer, the anterior iris interface and the anterior lens capsule as the anterior chamber region;
The volume of the anterior chamber region is determined as the region range value.
5. An aqueous humor cell concentration obtaining apparatus, comprising:
the image acquisition module is used for acquiring an ultrasonic image of the anterior segment of the target eye;
the regional scope acquisition module is used for acquiring an anterior chamber region and a corresponding regional scope value from the ultrasonic image;
the high-brightness gray-scale spot acquisition module is used for carrying out noise reduction on the anterior chamber area to obtain a plurality of gray-scale pixel point areas divided by the noise-reduced pixel points, and determining the gray-scale pixel point areas as high-brightness gray-scale spots of the anterior chamber area; the gray pixel point area is an aggregation area of a plurality of pixel points with the gray values still larger than 0 after noise reduction;
the gray difference value acquisition module is used for taking the pixel points in the range of each high-brightness gray scale spot as target pixel points, traversing the target pixel points and acquiring gray difference values of each target pixel point and all adjacent pixel points;
the cell pixel acquisition module is used for acquiring cell pixel points according to all the gray level difference values corresponding to the target pixel points and a preset gray level difference threshold value;
The cell number acquisition module is used for determining the number of the cell pixel points as the cell number;
the aqueous humor cell concentration calculating module is used for obtaining the aqueous humor cell concentration of the anterior chamber of the eye according to the ratio of the cell number to the regional range value;
wherein, the cell pixel acquisition module is used for:
comparing all the gray difference values corresponding to the target pixel points with a preset gray difference threshold value, and determining the corresponding target pixel points as cell pixel points if all the gray difference values are larger than the gray difference threshold value;
if only one gray difference value is smaller than or equal to the gray difference threshold value in all the gray difference values of the same target pixel point, determining the corresponding target pixel point as a to-be-determined pixel point;
and acquiring the position relation of all the undetermined pixel points, and determining one pixel point in the adjacent undetermined pixel points as a cell pixel point.
6. The aqueous humor cell concentration obtaining apparatus according to claim 5, wherein the high-brightness gray-scale speckle obtaining module includes:
the average gray value acquisition module is used for acquiring the average gray value of the anterior chamber area in the ultrasonic image;
The noise reduction module is used for determining the pixel points lower than the average gray value in the anterior chamber area as noise pixel points, and carrying out gray value reduction processing on the noise pixel points to reduce their gray values to a preset value of 0;
the high-brightness gray-scale spot determining module is used for determining pixel points which are higher than or equal to the average gray-scale value in the anterior chamber area as gray-scale pixel points; and determining the aggregation area of the gray pixel points isolated by the noise pixel points after noise reduction as the high-brightness gray-scale spots of the anterior chamber area.
7. A computer-readable storage medium storing a computer program, characterized in that: the computer program, when executed by a processor, implements the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of the eye according to any one of claims 1 to 4.
8. An electronic device, characterized in that: it comprises a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor, when executing the computer program, implements the steps of the aqueous humor cell concentration acquisition method of the anterior chamber of the eye according to any one of claims 1 to 4.
CN202210606965.XA 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye Active CN115035057B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210606965.XA CN115035057B (en) 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye

Publications (2)

Publication Number Publication Date
CN115035057A CN115035057A (en) 2022-09-09
CN115035057B true CN115035057B (en) 2023-07-11

Family

ID=83122255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210606965.XA Active CN115035057B (en) 2022-05-31 2022-05-31 Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye

Country Status (1)

Country Link
CN (1) CN115035057B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104794710A (en) * 2015-04-13 2015-07-22 上海泽煜实验设备有限公司 Image processing method and device
CN104794711B * 2015-04-13 2018-06-05 上海泽煜实验设备有限公司 Image processing method and device
CN107527028B (en) * 2017-08-18 2020-03-24 深圳乐普智能医疗器械有限公司 Target cell identification method and device and terminal
CN109461165A * 2018-09-29 2019-03-12 佛山市云米电器科技有限公司 Kitchen fume concentration classification and identification method based on three-color image segmentation
CN111340752A (en) * 2019-12-04 2020-06-26 京东方科技集团股份有限公司 Screen detection method and device, electronic equipment and computer readable storage medium
CN111798467B (en) * 2020-06-30 2024-05-03 中国第一汽车股份有限公司 Image segmentation method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN115035057A (en) 2022-09-09

Similar Documents

Publication Publication Date Title
US11989877B2 (en) Method and system for analysing images of a retina
WO2020147263A1 (en) Eye fundus image quality evaluation method, device and storage medium
US11284792B2 (en) Methods and systems for enhancing microangiography image quality
AU2019340215A1 (en) Methods and systems for ocular imaging, diagnosis and prognosis
CN109785399B (en) Synthetic lesion image generation method, device, equipment and readable storage medium
JP2023520001A (en) Correction of Flow Projection Artifacts in OCTA Volumes Using Neural Networks
Heslinga et al. Quantifying graft detachment after Descemet's membrane endothelial keratoplasty with deep convolutional neural networks
CN115035057B (en) Aqueous humor cell concentration acquisition method, apparatus, storage medium and device for anterior chamber of eye
CN115039122A (en) Deep neural network framework for processing OCT images to predict treatment intensity
CN113951813A (en) Retinal blood vessel branch angle calculation method and device and electronic equipment
CN115330663A (en) Method for segmenting boundaries of scleral lens and tear lens in anterior segment OCT (optical coherence tomography) image
WO2019171986A1 (en) Image processing device, image processing method, and program
CN114170378A (en) Medical equipment, blood vessel and internal plaque three-dimensional reconstruction method and device
CN110490857B (en) Image processing method, image processing device, electronic equipment and storage medium
JP2019150345A5 (en)
CN109363722B (en) Method and device for suppressing motion artifact in color flow imaging
Samagaio et al. Optical coherence tomography denoising by means of a fourier butterworth Filter-Based approach
CN115908274A (en) Device, equipment and medium for detecting focus
JP2004267584A (en) Ultrasonic diagnostic equipment
WO2021193008A1 (en) Program, information processing method, information processing device, and model generation method
Janpongsri et al. Pseudo‐real‐time retinal layer segmentation for high‐resolution adaptive optics optical coherence tomography
CN112120666A (en) Lens refractive power measuring and calculating method, device, equipment and storage medium
CN110610147A (en) Blood vessel image extraction method, related device and storage equipment
CN117503043B (en) OCT-based defocus amount intelligent identification method and device
US20220342060A1 (en) System and method for ultrasound imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant